Apr 22 19:54:53.769893 ip-10-0-139-10 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:54:53.769908 ip-10-0-139-10 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:54:53.769918 ip-10-0-139-10 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:54:53.770431 ip-10-0-139-10 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:55:03.821496 ip-10-0-139-10 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:55:03.821512 ip-10-0-139-10 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 294db9864a544b34a83604d6a8a54405 --
Apr 22 19:57:32.303575 ip-10-0-139-10 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:57:32.698757 ip-10-0-139-10 kubenswrapper[2548]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:32.698757 ip-10-0-139-10 kubenswrapper[2548]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:57:32.698757 ip-10-0-139-10 kubenswrapper[2548]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:32.698757 ip-10-0-139-10 kubenswrapper[2548]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:57:32.698757 ip-10-0-139-10 kubenswrapper[2548]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:32.701641 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.701548    2548 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:57:32.707741 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707722    2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:32.707741 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707741    2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707745    2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707748    2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707751    2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707754    2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707757    2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707759    2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707762    2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707764    2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707767    2548 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707769    2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707772    2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707783    2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707786    2548 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707789    2548 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707793    2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707796    2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707798    2548 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707802    2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:32.707826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707805    2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707808    2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707810    2548 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707812    2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707815    2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707818    2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707820    2548 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707823    2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707825    2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707828    2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707831    2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707834    2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707836    2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707839    2548 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707842    2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707845    2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707847    2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707849    2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707852    2548 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707854    2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707857    2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:32.708297 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707859    2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707861    2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707864    2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707866    2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707869    2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707874    2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707877    2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707880    2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707882    2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707885    2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707887    2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707890    2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707892    2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707896    2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707899    2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707902    2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707904    2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707907    2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707909    2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:32.708866 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707912    2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707914    2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707917    2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707919    2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707921    2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707924    2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707926    2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707929    2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707931    2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707933    2548 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707936    2548 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707938    2548 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707941    2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707943    2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707946    2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707948    2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707951    2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707954    2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707957    2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707960    2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:32.709351 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707963    2548 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707966    2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707968    2548 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707971    2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707973    2548 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.707975    2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708395    2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708401    2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708403    2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708406    2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708410    2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708414    2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708417    2548 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708419    2548 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708422    2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708425    2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708428    2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708430    2548 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708433    2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708435    2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:32.709823 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708438    2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708441    2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708443    2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708446    2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708448    2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708451    2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708454    2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708457    2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708459    2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708462    2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708465    2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708468    2548 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708471    2548 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708473    2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708476    2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708478    2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708481    2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708483    2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708485    2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708489    2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:32.710318 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708492    2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708494    2548 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708497    2548 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708499    2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708501    2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708504    2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708506    2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708508    2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708511    2548 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708513    2548 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708516    2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708518    2548 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708523    2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708526    2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708529    2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708533    2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708535    2548 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708538    2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708541    2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708543    2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:32.710826 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708546    2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708548    2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708551    2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708553    2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708557    2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708559    2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708562    2548 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708564    2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708566    2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708569    2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708571    2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708573    2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708576    2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708578    2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708580    2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708583    2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708586    2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708588    2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708591    2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708593    2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:32.711370 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708596    2548 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708598    2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708601    2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708603    2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708605    2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708608    2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708610    2548 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708612    2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708615    2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708617    2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708621    2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.708623    2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709940    2548 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709950    2548 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709957    2548 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709961    2548 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709966    2548 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709973    2548 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709978    2548 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709982    2548 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709985    2548 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:57:32.711862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709988    2548 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709992    2548 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709995    2548 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.709998    2548 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710002    2548 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710004    2548 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710007    2548 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710010    2548 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710013    2548 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710016    2548 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710024    2548 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710027    2548 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710030    2548 flags.go:64] FLAG: --config-dir=""
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710033    2548 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710037    2548 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710041    2548 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710044    2548 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710046    2548 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710050    2548 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710056    2548 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710059    2548 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710063    2548 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710066    2548 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710069    2548 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710073    2548 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:57:32.712428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710076    2548 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710080    2548 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710082    2548 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710087    2548 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710090    2548 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710095    2548 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710098    2548 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710101    2548 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710104    2548 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710107    2548 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710111    2548 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710114    2548 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710117    2548 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710120    2548 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710123    2548 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710126    2548 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710129    2548 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710132    2548 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710134    2548 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710137    2548 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710140    2548 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710144    2548 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]:
I0422 19:57:32.710147 2548 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710150 2548 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710153 2548 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710156 2548 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:57:32.713045 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710161 2548 flags.go:64] FLAG: --help="false" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710163 2548 flags.go:64] FLAG: --hostname-override="ip-10-0-139-10.ec2.internal" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710166 2548 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710169 2548 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710172 2548 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710175 2548 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710179 2548 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710182 2548 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710184 2548 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710188 2548 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 
19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710191 2548 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710194 2548 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710198 2548 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710200 2548 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710203 2548 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710206 2548 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710209 2548 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710211 2548 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710214 2548 flags.go:64] FLAG: --lock-file="" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710217 2548 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710219 2548 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710222 2548 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710228 2548 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:57:32.713688 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710230 2548 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710233 2548 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710236 2548 flags.go:64] FLAG: --logging-format="text" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710239 2548 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710242 2548 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710260 2548 flags.go:64] FLAG: --manifest-url="" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710263 2548 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710268 2548 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710271 2548 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710276 2548 flags.go:64] FLAG: --max-pods="110" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710279 2548 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710283 2548 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710285 2548 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710288 2548 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710291 2548 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710294 2548 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:57:32.714243 ip-10-0-139-10 
kubenswrapper[2548]: I0422 19:57:32.710297 2548 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710305 2548 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710310 2548 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710313 2548 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710316 2548 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710320 2548 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710326 2548 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710329 2548 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:57:32.714243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710332 2548 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710335 2548 flags.go:64] FLAG: --port="10250" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710339 2548 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710342 2548 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-095004ecdfe634b8e" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710345 2548 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710348 2548 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:57:32.714842 
ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710351 2548 flags.go:64] FLAG: --register-node="true" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710353 2548 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710356 2548 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710360 2548 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710363 2548 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710366 2548 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710369 2548 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710372 2548 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710376 2548 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710379 2548 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710382 2548 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710385 2548 flags.go:64] FLAG: --runonce="false" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710388 2548 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710391 2548 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710395 2548 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:57:32.714842 
ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710397 2548 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710400 2548 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710403 2548 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710406 2548 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710409 2548 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:57:32.714842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710413 2548 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710416 2548 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710419 2548 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710422 2548 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710425 2548 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710428 2548 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710431 2548 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710437 2548 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710440 2548 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710442 2548 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710447 2548 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710450 2548 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710453 2548 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710456 2548 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710459 2548 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710462 2548 flags.go:64] FLAG: --v="2" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710466 2548 flags.go:64] FLAG: --version="false" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710471 2548 flags.go:64] FLAG: --vmodule="" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710475 2548 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710478 2548 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710578 2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710582 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710585 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710588 2548 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 22 19:57:32.715460 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710592 2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710595 2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710598 2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710602 2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710606 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710609 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710612 2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710615 2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710617 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710620 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710622 2548 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710625 2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:32.716025 ip-10-0-139-10 
kubenswrapper[2548]: W0422 19:57:32.710628 2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710631 2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710633 2548 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710636 2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710638 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710641 2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710643 2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:32.716025 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710646 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710648 2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710651 2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710653 2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710656 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710658 2548 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710661 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710663 2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710666 2548 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710668 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710671 2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710673 2548 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710676 2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710678 2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710681 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710683 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710685 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710688 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710690 2548 feature_gate.go:328] 
unrecognized feature gate: MultiDiskSetup Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710693 2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:32.716574 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710695 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710698 2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710700 2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710703 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710705 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710707 2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710710 2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710714 2548 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710716 2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710719 2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710721 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710724 2548 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710726 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710729 2548 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710731 2548 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710734 2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710736 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710738 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710741 2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710743 2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710745 2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:32.717083 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710748 2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710751 2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710755 2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710757 2548 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710760 2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710763 2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710765 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710768 2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710770 2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710773 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710775 2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710778 2548 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710780 2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710784 2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710787 2548 feature_gate.go:328] 
unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710790 2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710792 2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710795 2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710797 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:32.717633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710800 2548 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:32.718300 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710803 2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:32.718300 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.710805 2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:32.718300 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.710811 2548 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:32.719483 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.719454 2548 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:57:32.719483 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.719480 2548 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719574 2548 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719585 2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719592 2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719598 2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719602 2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719606 2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719610 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719615 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719619 2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719623 2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719628 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719632 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719637 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719641 2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719645 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719649 2548 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719653 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719658 2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:32.719654 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719662 2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719667 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719672 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719676 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719690 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719694 2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719699 2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719703 2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719707 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719714 2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719720 2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719725 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719730 2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719735 2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719739 2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719743 2548 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719747 2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719751 2548 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719755 2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:32.720500 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719759 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719763 2548 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719767 2548 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719771 2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719775 2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719780 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719784 2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719789 2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719793 2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719798 2548 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719802 2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719806 2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719811 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719815 2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719820 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719824 2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719829 2548 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719833 2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719837 2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719841 2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:32.721097 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719846 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719850 2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719854 2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719859 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719863 2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719867 2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719871 2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719875 2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719879 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719883 2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719888 2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719892 2548 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719896 2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719901 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719905 2548 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719909 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719913 2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719918 2548 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719922 2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719936 2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:32.721701 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719941 2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719945 2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719949 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719953 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719958 2548 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719961 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719965 2548 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719972 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.719976 2548 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.719984 2548 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720146 2548 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720154 2548 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720160 2548 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720164 2548 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720169 2548 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:32.722499 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720173 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720177 2548 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720181 2548 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720185 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720190 2548 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720194 2548 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720198 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720202 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720206 2548 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720210 2548 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720214 2548 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720218 2548 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720222 2548 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720228 2548 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720236 2548 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720241 2548 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720265 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720279 2548 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720284 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:32.722912 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720288 2548 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720292 2548 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720296 2548 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720300 2548 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720304 2548 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720309 2548 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720313 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720318 2548 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720322 2548 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720326 2548 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720331 2548 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720334 2548 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720338 2548 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720342 2548 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720346 2548 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720351 2548 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720355 2548 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720359 2548 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720363 2548 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720367 2548 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:32.723443 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720372 2548 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720375 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720380 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720384 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720389 2548 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720393 2548 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720397 2548 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720401 2548 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720405 2548 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720409 2548 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720413 2548 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720426 2548 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720431 2548 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720435 2548 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720439 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720444 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720448 2548 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720452 2548 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720457 2548 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720461 2548 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:32.724143 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720466 2548 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720470 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720476 2548 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720481 2548 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720486 2548 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720490 2548 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720494 2548 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720498 2548 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720502 2548 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720506 2548 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720509 2548 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720513 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720518 2548 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720522 2548 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720526 2548 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720530 2548 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720536 2548 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720540 2548 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720544 2548 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720549 2548 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:32.724891 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720553 2548 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:32.725536 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:32.720557 2548 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:32.725536 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.720572 2548 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:32.725536 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.721303 2548 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:57:32.725536 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.725277 2548 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:57:32.726222 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.726208 2548 server.go:1019] "Starting client certificate rotation"
Apr 22 19:57:32.726300 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.726281 2548 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:57:32.726346 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.726330 2548 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:57:32.746699 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.746668 2548 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:57:32.749614 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.749588 2548 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:57:32.759663 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.759640 2548 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:57:32.766602 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.766580 2548 log.go:25] "Validated CRI v1 image API"
Apr 22 19:57:32.768071 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.768054 2548 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:57:32.774877 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.774843 2548 fs.go:135] Filesystem UUIDs: map[6af9d8b5-1ddf-4d4a-9c54-d25f84de5fc6:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9075b812-b039-4e72-b32e-e2be0a616f6d:/dev/nvme0n1p3]
Apr 22 19:57:32.774991 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.774877 2548 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:57:32.777895 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.777872 2548 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:57:32.782014 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.781886 2548 manager.go:217] Machine: {Timestamp:2026-04-22 19:57:32.77955428 +0000 UTC m=+0.368045603 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3107743 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29ce9d1f6079410bcdbb5dafb26017 SystemUUID:ec29ce9d-1f60-7941-0bcd-bb5dafb26017 BootID:294db986-4a54-4b34-a836-04d6a8a54405 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:02:96:87:31:91 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:02:96:87:31:91 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1a:84:d3:08:7f:90 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:57:32.782014 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.782002 2548 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:57:32.782200 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.782182 2548 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:57:32.783194 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.783168 2548 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:57:32.783380 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.783196 2548 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-10.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:57:32.783458 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.783395 2548 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:57:32.783458 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.783408 2548 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:57:32.783458 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.783427 2548 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:32.784435 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.784422 2548 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:32.785917 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.785904 2548 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:32.786054 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.786043 2548 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:57:32.788281 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.788270 2548 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:57:32.788338 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.788289 2548 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:57:32.788338 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.788306 2548 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:57:32.788338 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.788319 2548 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:57:32.788338 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.788332 2548 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 19:57:32.789481 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.789467 2548 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:32.789546 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.789492 2548 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:32.792203 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.792186 2548 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:57:32.794069 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.794052 2548 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:57:32.795451 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795435 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:57:32.795552 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795464 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:57:32.795552 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795478 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:57:32.795552 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795492 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:57:32.795552 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795503 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:57:32.795552 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795513 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:57:32.795552 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795521 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
19:57:32.795552 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795533 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:57:32.795552 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795548 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:57:32.795818 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795560 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:57:32.795818 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795595 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:57:32.795818 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795615 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:57:32.795925 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.795903 2548 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mz4q7" Apr 22 19:57:32.796395 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.796383 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:57:32.796453 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.796399 2548 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:57:32.799768 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.799738 2548 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-10.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:57:32.799857 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.799791 2548 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:57:32.799857 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.799786 2548 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-10.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:57:32.800144 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.800131 2548 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:57:32.800264 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.800240 2548 server.go:1295] "Started kubelet" Apr 22 19:57:32.800421 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.800396 2548 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:57:32.800474 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.800404 2548 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:57:32.800521 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.800484 2548 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:57:32.801240 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.801204 2548 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mz4q7" Apr 22 19:57:32.801263 ip-10-0-139-10 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:57:32.802113 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.802098 2548 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:57:32.802573 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.802561 2548 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:57:32.811022 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.810834 2548 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:57:32.811768 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.811753 2548 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:57:32.812438 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.812423 2548 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:57:32.814503 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.814352 2548 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:57:32.814598 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.814507 2548 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:57:32.814598 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.814543 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:32.814735 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.814715 2548 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:57:32.814782 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.814737 2548 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:57:32.814875 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.814861 2548 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:57:32.815609 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.815588 2548 factory.go:55] Registering systemd factory Apr 22 19:57:32.815724 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.815712 2548 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:57:32.816003 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.815989 2548 factory.go:153] Registering CRI-O factory Apr 22 19:57:32.816003 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.816004 2548 factory.go:223] Registration of the crio container factory successfully Apr 22 
19:57:32.816117 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.816095 2548 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:57:32.816163 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.816120 2548 factory.go:103] Registering Raw factory Apr 22 19:57:32.816163 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.816142 2548 manager.go:1196] Started watching for new ooms in manager Apr 22 19:57:32.816814 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.816788 2548 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:32.817012 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.816998 2548 manager.go:319] Starting recovery of all containers Apr 22 19:57:32.818140 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.818117 2548 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-10.ec2.internal\" not found" node="ip-10-0-139-10.ec2.internal" Apr 22 19:57:32.828520 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.828504 2548 manager.go:324] Recovery completed Apr 22 19:57:32.832464 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.832451 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:32.835003 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.834983 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:32.835073 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.835014 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:32.835073 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.835028 
2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:32.835542 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.835528 2548 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:57:32.835595 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.835542 2548 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:57:32.835595 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.835560 2548 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:32.839901 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.839885 2548 policy_none.go:49] "None policy: Start" Apr 22 19:57:32.839963 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.839904 2548 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:57:32.839963 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.839915 2548 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:57:32.881905 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.881887 2548 manager.go:341] "Starting Device Plugin manager" Apr 22 19:57:32.897300 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.881930 2548 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:57:32.897300 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.881942 2548 server.go:85] "Starting device plugin registration server" Apr 22 19:57:32.897300 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.882225 2548 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:57:32.897300 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.882242 2548 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:57:32.897300 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.882365 2548 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 
19:57:32.897300 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.882452 2548 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:57:32.897300 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.882461 2548 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:57:32.897300 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.882958 2548 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:57:32.897300 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.882997 2548 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:32.945716 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.945674 2548 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:57:32.947077 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.947054 2548 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:57:32.947077 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.947079 2548 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:57:32.947232 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.947101 2548 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 19:57:32.947232 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.947107 2548 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:57:32.947232 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.947149 2548 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:57:32.950472 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.950396 2548 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:32.983177 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.983150 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:32.984382 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.984361 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:32.984498 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.984397 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:32.984498 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.984412 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:32.984498 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.984435 2548 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-10.ec2.internal" Apr 22 19:57:32.992339 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:32.992316 2548 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-10.ec2.internal" Apr 22 19:57:32.992339 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:32.992340 2548 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-10.ec2.internal\": node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:33.008639 
ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.008614 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:33.047774 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.047731 2548 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal"] Apr 22 19:57:33.047897 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.047840 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:33.048803 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.048786 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:33.048879 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.048821 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:33.048879 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.048832 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:33.050778 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.050765 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:33.050901 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.050885 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.050946 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.050932 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:33.051562 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.051546 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:33.051635 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.051576 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:33.051635 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.051547 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:33.051635 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.051610 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:33.051635 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.051623 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:33.051635 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.051587 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:33.053543 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.053529 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.053592 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.053556 2548 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:33.054215 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.054192 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:33.054317 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.054231 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:33.054317 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.054262 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:33.076883 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.076863 2548 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-10.ec2.internal\" not found" node="ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.081119 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.081104 2548 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-10.ec2.internal\" not found" node="ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.109088 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.109062 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:33.116096 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.116077 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.116164 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.116103 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.116164 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.116119 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baab329e26e3fec548046283f03a6805-config\") pod \"kube-apiserver-proxy-ip-10-0-139-10.ec2.internal\" (UID: \"baab329e26e3fec548046283f03a6805\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.210236 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.210160 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:33.216586 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.216561 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baab329e26e3fec548046283f03a6805-config\") pod \"kube-apiserver-proxy-ip-10-0-139-10.ec2.internal\" (UID: \"baab329e26e3fec548046283f03a6805\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.216639 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.216592 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.216639 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.216621 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.216711 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.216684 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.216764 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.216685 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baab329e26e3fec548046283f03a6805-config\") pod \"kube-apiserver-proxy-ip-10-0-139-10.ec2.internal\" (UID: \"baab329e26e3fec548046283f03a6805\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.216764 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.216714 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.311048 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.311005 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:33.379669 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.379634 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.383554 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.383325 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 19:57:33.411258 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.411215 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:33.511904 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.511874 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:33.612477 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.612441 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:33.713171 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.713139 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 19:57:33.726590 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.726556 2548 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 19:57:33.726746 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.726724 2548 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:57:33.726804 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.726752 2548 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:57:33.804399 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.804356 2548 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:52:32 +0000 UTC" deadline="2027-11-03 07:58:52.305044015 +0000 UTC"
Apr 22 19:57:33.804399 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.804393 2548 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13428h1m18.500655002s"
Apr 22 19:57:33.812440 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.812392 2548 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:57:33.813233 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:33.813212 2548 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found"
Apr 22 19:57:33.824669 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.824638 2548 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:57:33.839302 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.839275 2548 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-g7x52"
Apr 22 19:57:33.846648 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.846617 2548 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-g7x52"
Apr 22 19:57:33.874804 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.874773 2548 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:57:33.912934 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.912906 2548 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal"
Apr 22 19:57:33.920846 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.920795 2548 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:57:33.922134 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.922121 2548 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal"
Apr 22 19:57:33.926024 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:33.925994 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a85548356ac6e6ad9bedf610076abee.slice/crio-40ae3b0b5f10960c0e2f9f6dbcd0f9a45168b8669e9c2ca52bd5fb7214b51722 WatchSource:0}: Error finding container 40ae3b0b5f10960c0e2f9f6dbcd0f9a45168b8669e9c2ca52bd5fb7214b51722: Status 404 returned error can't find the container with id 40ae3b0b5f10960c0e2f9f6dbcd0f9a45168b8669e9c2ca52bd5fb7214b51722
Apr 22 19:57:33.926327 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:33.926309 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaab329e26e3fec548046283f03a6805.slice/crio-a3ce8180565373c015742113b41e3c7b1fa4cd016983b1ab8077ff24493ac2a9 WatchSource:0}: Error finding container a3ce8180565373c015742113b41e3c7b1fa4cd016983b1ab8077ff24493ac2a9: Status 404 returned error can't find the container with id a3ce8180565373c015742113b41e3c7b1fa4cd016983b1ab8077ff24493ac2a9
Apr 22 19:57:33.930726 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.930711 2548 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:57:33.930876 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.930860 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:57:33.950122 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.950055 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" event={"ID":"9a85548356ac6e6ad9bedf610076abee","Type":"ContainerStarted","Data":"40ae3b0b5f10960c0e2f9f6dbcd0f9a45168b8669e9c2ca52bd5fb7214b51722"}
Apr 22 19:57:33.950859 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:33.950842 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" event={"ID":"baab329e26e3fec548046283f03a6805","Type":"ContainerStarted","Data":"a3ce8180565373c015742113b41e3c7b1fa4cd016983b1ab8077ff24493ac2a9"}
Apr 22 19:57:34.331020 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.330989 2548 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:57:34.665060 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.664964 2548 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:57:34.789649 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.789615 2548 apiserver.go:52] "Watching apiserver"
Apr 22 19:57:34.796645 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.796619 2548 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:57:34.798577 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.798549 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vqlnc","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal","openshift-network-diagnostics/network-check-target-pk9ff","openshift-ovn-kubernetes/ovnkube-node-cqc2r","kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv","openshift-multus/multus-additional-cni-plugins-lq662","openshift-multus/multus-zkw8l","openshift-multus/network-metrics-daemon-jjztz","openshift-network-operator/iptables-alerter-5xnhn","kube-system/konnectivity-agent-r6qd8","openshift-cluster-node-tuning-operator/tuned-5hz9q"]
Apr 22 19:57:34.801150 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.801126 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5xnhn"
Apr 22 19:57:34.803265 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.803226 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-r6qd8"
Apr 22 19:57:34.803618 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.803598 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:57:34.803769 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.803633 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:57:34.803910 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.803892 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-wnd2q\""
Apr 22 19:57:34.804017 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.803940 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:57:34.808325 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.805870 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:57:34.808325 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.806118 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-p42gv\""
Apr 22 19:57:34.808325 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.806523 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:57:34.809328 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.809310 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vqlnc"
Apr 22 19:57:34.809424 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.809364 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff"
Apr 22 19:57:34.809772 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:34.809432 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf"
Apr 22 19:57:34.811995 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.811680 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.811995 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.811695 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-smzgt\""
Apr 22 19:57:34.811995 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.811812 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:57:34.811995 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.811878 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:57:34.812235 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.812223 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:57:34.813915 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.813895 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv"
Apr 22 19:57:34.814463 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.814427 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:57:34.814567 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.814431 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:57:34.814567 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.814482 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:57:34.814676 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.814491 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:57:34.815169 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.815149 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rshfs\""
Apr 22 19:57:34.815288 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.815174 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:57:34.815469 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.815453 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:57:34.816301 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.816282 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lq662"
Apr 22 19:57:34.817148 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.816902 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:57:34.817148 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.817035 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-k7gmp\""
Apr 22 19:57:34.817148 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.817035 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:57:34.817148 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.817139 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:57:34.818620 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.818602 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:57:34.818854 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.818837 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:57:34.818938 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.818877 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wvvk9\""
Apr 22 19:57:34.819023 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.818611 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zkw8l"
Apr 22 19:57:34.819081 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.819023 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:57:34.819326 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.819206 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:57:34.819326 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.819206 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:57:34.821054 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.821032 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 19:57:34.821150 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:34.821112 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9"
Apr 22 19:57:34.821599 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.821579 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:57:34.821710 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.821693 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rrllm\""
Apr 22 19:57:34.823490 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.823471 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5hz9q"
Apr 22 19:57:34.824377 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824358 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-socket-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv"
Apr 22 19:57:34.824485 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824390 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41898967-cc03-4d1a-a021-cc3f7817d848-host-slash\") pod \"iptables-alerter-5xnhn\" (UID: \"41898967-cc03-4d1a-a021-cc3f7817d848\") " pod="openshift-network-operator/iptables-alerter-5xnhn"
Apr 22 19:57:34.824485 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824416 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-systemd-units\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824485 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824440 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-run-ovn\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824485 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824465 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv"
Apr 22 19:57:34.824676 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824491 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9d4bf3cd-4394-40c1-a8fc-3a9c169a083c-konnectivity-ca\") pod \"konnectivity-agent-r6qd8\" (UID: \"9d4bf3cd-4394-40c1-a8fc-3a9c169a083c\") " pod="kube-system/konnectivity-agent-r6qd8"
Apr 22 19:57:34.824676 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824513 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-node-log\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824676 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824536 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a7bdf57-222a-4e36-b827-d320c2eaaac4-ovnkube-config\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824676 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824600 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sznwb\" (UniqueName: \"kubernetes.io/projected/3a7bdf57-222a-4e36-b827-d320c2eaaac4-kube-api-access-sznwb\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824676 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824636 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-cnibin\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662"
Apr 22 19:57:34.824676 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824670 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824696 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9d4bf3cd-4394-40c1-a8fc-3a9c169a083c-agent-certs\") pod \"konnectivity-agent-r6qd8\" (UID: \"9d4bf3cd-4394-40c1-a8fc-3a9c169a083c\") " pod="kube-system/konnectivity-agent-r6qd8"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824722 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwgfj\" (UniqueName: \"kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj\") pod \"network-check-target-pk9ff\" (UID: \"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf\") " pod="openshift-network-diagnostics/network-check-target-pk9ff"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824745 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-run-systemd\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824770 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-run-openvswitch\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824812 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824857 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a7bdf57-222a-4e36-b827-d320c2eaaac4-env-overrides\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824886 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a7bdf57-222a-4e36-b827-d320c2eaaac4-ovn-node-metrics-cert\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824913 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a7bdf57-222a-4e36-b827-d320c2eaaac4-ovnkube-script-lib\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824938 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-kubelet\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.824997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.824961 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-run-netns\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825001 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-etc-openvswitch\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825036 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-log-socket\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825065 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-run-ovn-kubernetes\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825092 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-cni-bin\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825116 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-cni-netd\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825140 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-etc-selinux\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825186 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/41898967-cc03-4d1a-a021-cc3f7817d848-iptables-alerter-script\") pod \"iptables-alerter-5xnhn\" (UID: \"41898967-cc03-4d1a-a021-cc3f7817d848\") " pod="openshift-network-operator/iptables-alerter-5xnhn"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825283 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfbb3072-c0b3-48da-8291-55700270a1f3-host\") pod \"node-ca-vqlnc\" (UID: \"dfbb3072-c0b3-48da-8291-55700270a1f3\") " pod="openshift-image-registry/node-ca-vqlnc"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825315 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dfbb3072-c0b3-48da-8291-55700270a1f3-serviceca\") pod \"node-ca-vqlnc\" (UID: \"dfbb3072-c0b3-48da-8291-55700270a1f3\") " pod="openshift-image-registry/node-ca-vqlnc"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825341 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-var-lib-openvswitch\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.825446 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825405 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-sys-fs\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825436 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5lb4\" (UniqueName: \"kubernetes.io/projected/dfbb3072-c0b3-48da-8291-55700270a1f3-kube-api-access-d5lb4\") pod \"node-ca-vqlnc\" (UID: \"dfbb3072-c0b3-48da-8291-55700270a1f3\") " pod="openshift-image-registry/node-ca-vqlnc"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825480 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-device-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825496 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqcfh\" (UniqueName: \"kubernetes.io/projected/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-kube-api-access-pqcfh\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825533 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-system-cni-dir\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825557 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-os-release\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825585 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8071e1d3-8155-4265-9de0-c92543778149-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825610 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md4rm\" (UniqueName: \"kubernetes.io/projected/8071e1d3-8155-4265-9de0-c92543778149-kube-api-access-md4rm\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825677 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx49d\" (UniqueName: \"kubernetes.io/projected/41898967-cc03-4d1a-a021-cc3f7817d848-kube-api-access-xx49d\") pod \"iptables-alerter-5xnhn\" (UID: \"41898967-cc03-4d1a-a021-cc3f7817d848\") " pod="openshift-network-operator/iptables-alerter-5xnhn"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825717 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-slash\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825743 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-registration-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825768 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8071e1d3-8155-4265-9de0-c92543778149-cni-binary-copy\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662"
Apr 22 19:57:34.825882 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825804 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8071e1d3-8155-4265-9de0-c92543778149-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662"
Apr 22 19:57:34.826270 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.825946 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:57:34.826270 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.826034 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:57:34.826270 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.826088 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vmj5p\""
Apr 22 19:57:34.847587 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.847560 2548 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:33 +0000 UTC" deadline="2027-10-26 03:21:17.973502675 +0000 UTC"
Apr 22 19:57:34.847682 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.847586 2548 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13231h23m43.1259193s"
Apr 22 19:57:34.915729 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.915651 2548 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 19:57:34.926440 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926400 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-etc-openvswitch\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.926603 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926446 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-log-socket\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.926603 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926469 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-run-ovn-kubernetes\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:34.926603 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926536 2548 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-log-socket\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.926603 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926534 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-cni-bin\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.926603 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926579 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-cni-bin\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.926603 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926597 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-cni-binary-copy\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926621 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-run-multus-certs\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926627 2548 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-etc-openvswitch\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926643 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-run\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926663 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-sys\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926686 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-run-ovn-kubernetes\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926691 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfbb3072-c0b3-48da-8291-55700270a1f3-host\") pod \"node-ca-vqlnc\" (UID: \"dfbb3072-c0b3-48da-8291-55700270a1f3\") " pod="openshift-image-registry/node-ca-vqlnc" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926727 2548 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfbb3072-c0b3-48da-8291-55700270a1f3-host\") pod \"node-ca-vqlnc\" (UID: \"dfbb3072-c0b3-48da-8291-55700270a1f3\") " pod="openshift-image-registry/node-ca-vqlnc" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926745 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-hostroot\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926786 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqfls\" (UniqueName: \"kubernetes.io/projected/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-kube-api-access-cqfls\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926818 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-device-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926843 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-modprobe-d\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.926861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926867 2548 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-systemd\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926892 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-tuned\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926915 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-tmp\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926920 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-device-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926942 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx49d\" (UniqueName: \"kubernetes.io/projected/41898967-cc03-4d1a-a021-cc3f7817d848-kube-api-access-xx49d\") pod \"iptables-alerter-5xnhn\" (UID: \"41898967-cc03-4d1a-a021-cc3f7817d848\") " pod="openshift-network-operator/iptables-alerter-5xnhn" Apr 22 
19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926975 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-slash\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.926997 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8071e1d3-8155-4265-9de0-c92543778149-cni-binary-copy\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927013 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8071e1d3-8155-4265-9de0-c92543778149-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927038 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-var-lib-cni-multus\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927061 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-kubernetes\") pod 
\"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927075 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-sysctl-d\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927106 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8mj\" (UniqueName: \"kubernetes.io/projected/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-kube-api-access-zq8mj\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927133 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927149 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-system-cni-dir\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927176 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-cni-dir\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927196 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9d4bf3cd-4394-40c1-a8fc-3a9c169a083c-konnectivity-ca\") pod \"konnectivity-agent-r6qd8\" (UID: \"9d4bf3cd-4394-40c1-a8fc-3a9c169a083c\") " pod="kube-system/konnectivity-agent-r6qd8" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927044 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-slash\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927303 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927230 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-node-log\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927298 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a7bdf57-222a-4e36-b827-d320c2eaaac4-ovnkube-config\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927324 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927303 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927393 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-sysconfig\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927419 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-run-systemd\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927445 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-run-openvswitch\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927493 2548 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a7bdf57-222a-4e36-b827-d320c2eaaac4-env-overrides\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927520 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a7bdf57-222a-4e36-b827-d320c2eaaac4-ovn-node-metrics-cert\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927546 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a7bdf57-222a-4e36-b827-d320c2eaaac4-ovnkube-script-lib\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927571 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-etc-selinux\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927614 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-conf-dir\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: 
I0422 19:57:34.927633 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-node-log\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927676 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-kubelet\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927697 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-run-netns\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927730 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-cni-netd\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927746 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-sys-fs\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.927842 ip-10-0-139-10 kubenswrapper[2548]: 
I0422 19:57:34.927754 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8071e1d3-8155-4265-9de0-c92543778149-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927772 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927780 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9d4bf3cd-4394-40c1-a8fc-3a9c169a083c-konnectivity-ca\") pod \"konnectivity-agent-r6qd8\" (UID: \"9d4bf3cd-4394-40c1-a8fc-3a9c169a083c\") " pod="kube-system/konnectivity-agent-r6qd8" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927793 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8071e1d3-8155-4265-9de0-c92543778149-cni-binary-copy\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927788 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-host\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927882 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-kubelet\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927922 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-run-openvswitch\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927928 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-cni-netd\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.927996 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-sys-fs\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928051 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lq662\" (UID: 
\"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928225 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a7bdf57-222a-4e36-b827-d320c2eaaac4-env-overrides\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928272 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-run-systemd\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928281 2548 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928307 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-run-netns\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928332 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/41898967-cc03-4d1a-a021-cc3f7817d848-iptables-alerter-script\") pod \"iptables-alerter-5xnhn\" (UID: \"41898967-cc03-4d1a-a021-cc3f7817d848\") " pod="openshift-network-operator/iptables-alerter-5xnhn" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928380 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dfbb3072-c0b3-48da-8291-55700270a1f3-serviceca\") pod \"node-ca-vqlnc\" (UID: \"dfbb3072-c0b3-48da-8291-55700270a1f3\") " pod="openshift-image-registry/node-ca-vqlnc" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928407 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-var-lib-openvswitch\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928421 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-etc-selinux\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.928644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928421 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a7bdf57-222a-4e36-b827-d320c2eaaac4-ovnkube-config\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928453 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-run-k8s-cni-cncf-io\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928482 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-var-lib-openvswitch\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928506 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5lb4\" (UniqueName: \"kubernetes.io/projected/dfbb3072-c0b3-48da-8291-55700270a1f3-kube-api-access-d5lb4\") pod \"node-ca-vqlnc\" (UID: \"dfbb3072-c0b3-48da-8291-55700270a1f3\") " pod="openshift-image-registry/node-ca-vqlnc" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928539 2548 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-var-lib-cni-bin\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928615 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-daemon-config\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928677 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztrpb\" (UniqueName: \"kubernetes.io/projected/450a901e-1810-4879-8bc6-97efb2b1c9d9-kube-api-access-ztrpb\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928719 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-sysctl-conf\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928769 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-registration-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928805 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-cnibin\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928818 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-registration-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928854 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-socket-dir-parent\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928862 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dfbb3072-c0b3-48da-8291-55700270a1f3-serviceca\") pod \"node-ca-vqlnc\" (UID: \"dfbb3072-c0b3-48da-8291-55700270a1f3\") " pod="openshift-image-registry/node-ca-vqlnc" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928883 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-etc-kubernetes\") pod \"multus-zkw8l\" (UID: 
\"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928919 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/41898967-cc03-4d1a-a021-cc3f7817d848-iptables-alerter-script\") pod \"iptables-alerter-5xnhn\" (UID: \"41898967-cc03-4d1a-a021-cc3f7817d848\") " pod="openshift-network-operator/iptables-alerter-5xnhn" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928945 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-socket-dir\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.928997 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-os-release\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.929549 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929021 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-var-lib-kubelet\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929043 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-socket-dir\") pod 
\"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929048 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-var-lib-kubelet\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929078 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41898967-cc03-4d1a-a021-cc3f7817d848-host-slash\") pod \"iptables-alerter-5xnhn\" (UID: \"41898967-cc03-4d1a-a021-cc3f7817d848\") " pod="openshift-network-operator/iptables-alerter-5xnhn" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929105 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-systemd-units\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929130 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-run-ovn\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929151 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-systemd-units\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929156 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sznwb\" (UniqueName: \"kubernetes.io/projected/3a7bdf57-222a-4e36-b827-d320c2eaaac4-kube-api-access-sznwb\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929158 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41898967-cc03-4d1a-a021-cc3f7817d848-host-slash\") pod \"iptables-alerter-5xnhn\" (UID: \"41898967-cc03-4d1a-a021-cc3f7817d848\") " pod="openshift-network-operator/iptables-alerter-5xnhn" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929187 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-cnibin\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929202 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-run-ovn\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929229 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/3a7bdf57-222a-4e36-b827-d320c2eaaac4-ovnkube-script-lib\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929232 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-run-netns\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929266 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-cnibin\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929292 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-lib-modules\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929320 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9d4bf3cd-4394-40c1-a8fc-3a9c169a083c-agent-certs\") pod \"konnectivity-agent-r6qd8\" (UID: \"9d4bf3cd-4394-40c1-a8fc-3a9c169a083c\") " pod="kube-system/konnectivity-agent-r6qd8" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929344 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kwgfj\" (UniqueName: \"kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj\") pod \"network-check-target-pk9ff\" (UID: \"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf\") " pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:34.930515 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929368 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.931294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929396 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqcfh\" (UniqueName: \"kubernetes.io/projected/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-kube-api-access-pqcfh\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.931294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929425 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-system-cni-dir\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.931294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929451 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-os-release\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " 
pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.931294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929475 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8071e1d3-8155-4265-9de0-c92543778149-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.931294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929473 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a7bdf57-222a-4e36-b827-d320c2eaaac4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.931294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929510 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-md4rm\" (UniqueName: \"kubernetes.io/projected/8071e1d3-8155-4265-9de0-c92543778149-kube-api-access-md4rm\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.931294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929573 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-system-cni-dir\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.931294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.929642 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/8071e1d3-8155-4265-9de0-c92543778149-os-release\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.931294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.930036 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8071e1d3-8155-4265-9de0-c92543778149-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:34.932403 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.932384 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a7bdf57-222a-4e36-b827-d320c2eaaac4-ovn-node-metrics-cert\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.932594 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.932575 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9d4bf3cd-4394-40c1-a8fc-3a9c169a083c-agent-certs\") pod \"konnectivity-agent-r6qd8\" (UID: \"9d4bf3cd-4394-40c1-a8fc-3a9c169a083c\") " pod="kube-system/konnectivity-agent-r6qd8" Apr 22 19:57:34.934235 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:34.934215 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:34.934500 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:34.934239 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:34.934500 
ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:34.934279 2548 projected.go:194] Error preparing data for projected volume kube-api-access-kwgfj for pod openshift-network-diagnostics/network-check-target-pk9ff: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:34.934500 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:34.934367 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj podName:2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf nodeName:}" failed. No retries permitted until 2026-04-22 19:57:35.434321603 +0000 UTC m=+3.022812923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kwgfj" (UniqueName: "kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj") pod "network-check-target-pk9ff" (UID: "2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:34.934759 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.934424 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx49d\" (UniqueName: \"kubernetes.io/projected/41898967-cc03-4d1a-a021-cc3f7817d848-kube-api-access-xx49d\") pod \"iptables-alerter-5xnhn\" (UID: \"41898967-cc03-4d1a-a021-cc3f7817d848\") " pod="openshift-network-operator/iptables-alerter-5xnhn" Apr 22 19:57:34.935820 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.935791 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5lb4\" (UniqueName: \"kubernetes.io/projected/dfbb3072-c0b3-48da-8291-55700270a1f3-kube-api-access-d5lb4\") pod \"node-ca-vqlnc\" (UID: \"dfbb3072-c0b3-48da-8291-55700270a1f3\") " pod="openshift-image-registry/node-ca-vqlnc" Apr 22 
19:57:34.936532 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.936513 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sznwb\" (UniqueName: \"kubernetes.io/projected/3a7bdf57-222a-4e36-b827-d320c2eaaac4-kube-api-access-sznwb\") pod \"ovnkube-node-cqc2r\" (UID: \"3a7bdf57-222a-4e36-b827-d320c2eaaac4\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:34.939556 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.939529 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqcfh\" (UniqueName: \"kubernetes.io/projected/2c69cc19-8d6a-4316-b289-a47a9ce15bd3-kube-api-access-pqcfh\") pod \"aws-ebs-csi-driver-node-9vshv\" (UID: \"2c69cc19-8d6a-4316-b289-a47a9ce15bd3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:34.941214 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:34.941187 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-md4rm\" (UniqueName: \"kubernetes.io/projected/8071e1d3-8155-4265-9de0-c92543778149-kube-api-access-md4rm\") pod \"multus-additional-cni-plugins-lq662\" (UID: \"8071e1d3-8155-4265-9de0-c92543778149\") " pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:35.030927 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.030883 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-var-lib-cni-bin\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031120 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.030941 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-daemon-config\") pod \"multus-zkw8l\" (UID: 
\"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031120 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.030965 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrpb\" (UniqueName: \"kubernetes.io/projected/450a901e-1810-4879-8bc6-97efb2b1c9d9-kube-api-access-ztrpb\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:35.031120 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.030994 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-sysctl-conf\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031120 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031018 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-cnibin\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031120 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031034 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-socket-dir-parent\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031120 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031050 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-etc-kubernetes\") pod \"multus-zkw8l\" (UID: 
\"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031120 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031065 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-os-release\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031120 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031087 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-var-lib-kubelet\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031120 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031109 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-var-lib-kubelet\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031138 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-run-netns\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031161 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-lib-modules\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031210 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-cni-binary-copy\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031228 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-run-multus-certs\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031268 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-run\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031289 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-sys\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031309 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-hostroot\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031513 
ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031334 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqfls\" (UniqueName: \"kubernetes.io/projected/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-kube-api-access-cqfls\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031360 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-modprobe-d\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031381 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-systemd\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031407 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-tuned\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031427 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-tmp\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 
ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031470 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-var-lib-cni-multus\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031491 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-kubernetes\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.031513 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031511 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-sysctl-d\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031539 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8mj\" (UniqueName: \"kubernetes.io/projected/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-kube-api-access-zq8mj\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031560 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-system-cni-dir\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032115 ip-10-0-139-10 
kubenswrapper[2548]: I0422 19:57:35.031595 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-cni-dir\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031629 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-sysconfig\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031652 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-conf-dir\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031669 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031687 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-host\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031714 
2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-run-k8s-cni-cncf-io\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031778 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-sys\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031787 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-run-k8s-cni-cncf-io\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031818 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-hostroot\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032115 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.031847 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-var-lib-cni-bin\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032597 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032236 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-modprobe-d\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.032597 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032330 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-systemd\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.032597 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032443 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-daemon-config\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032749 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032718 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-var-lib-kubelet\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032794 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032771 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-var-lib-kubelet\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.032846 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032804 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-sysctl-conf\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.032899 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032844 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-run-netns\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.032899 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032865 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-lib-modules\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.032899 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032890 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-cnibin\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033029 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032926 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-etc-kubernetes\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033029 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032908 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-system-cni-dir\") pod \"multus-zkw8l\" (UID: 
\"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033029 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.032988 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-os-release\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033153 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033064 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-run-multus-certs\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033153 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033105 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-conf-dir\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033179 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-cni-dir\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033243 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033229 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-sysconfig\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.033356 
ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033298 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-run\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.033356 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033316 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-kubernetes\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.033356 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033342 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-cni-binary-copy\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033480 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033385 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-host-var-lib-cni-multus\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033480 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033388 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-multus-socket-dir-parent\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.033480 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033411 2548 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-sysctl-d\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.033578 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:35.033488 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:35.033578 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:35.033564 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs podName:450a901e-1810-4879-8bc6-97efb2b1c9d9 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:35.533545421 +0000 UTC m=+3.122036757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs") pod "network-metrics-daemon-jjztz" (UID: "450a901e-1810-4879-8bc6-97efb2b1c9d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:35.033699 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.033675 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-host\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.035986 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.035963 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-tmp\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.036089 
ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.036019 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-etc-tuned\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.040264 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.040224 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztrpb\" (UniqueName: \"kubernetes.io/projected/450a901e-1810-4879-8bc6-97efb2b1c9d9-kube-api-access-ztrpb\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:35.040709 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.040684 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqfls\" (UniqueName: \"kubernetes.io/projected/0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76-kube-api-access-cqfls\") pod \"tuned-5hz9q\" (UID: \"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76\") " pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.043656 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.043630 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8mj\" (UniqueName: \"kubernetes.io/projected/ef5f98dc-99df-42ee-b6ba-f81c8f509e56-kube-api-access-zq8mj\") pod \"multus-zkw8l\" (UID: \"ef5f98dc-99df-42ee-b6ba-f81c8f509e56\") " pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.114548 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.114511 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5xnhn" Apr 22 19:57:35.121377 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.121351 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-r6qd8" Apr 22 19:57:35.129939 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.129916 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vqlnc" Apr 22 19:57:35.134693 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.134669 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:57:35.141742 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.141720 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" Apr 22 19:57:35.150405 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.150387 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lq662" Apr 22 19:57:35.157994 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.157975 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zkw8l" Apr 22 19:57:35.163607 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.163587 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" Apr 22 19:57:35.319567 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.319531 2548 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:35.435195 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.435162 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwgfj\" (UniqueName: \"kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj\") pod \"network-check-target-pk9ff\" (UID: \"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf\") " pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:35.435401 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:35.435380 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:35.435468 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:35.435405 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:35.435468 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:35.435418 2548 projected.go:194] Error preparing data for projected volume kube-api-access-kwgfj for pod openshift-network-diagnostics/network-check-target-pk9ff: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:35.435550 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:35.435476 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj podName:2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf nodeName:}" failed. 
No retries permitted until 2026-04-22 19:57:36.435461719 +0000 UTC m=+4.023953025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kwgfj" (UniqueName: "kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj") pod "network-check-target-pk9ff" (UID: "2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:35.536182 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.536148 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:35.536376 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:35.536297 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:35.536432 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:35.536377 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs podName:450a901e-1810-4879-8bc6-97efb2b1c9d9 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:36.536351203 +0000 UTC m=+4.124842543 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs") pod "network-metrics-daemon-jjztz" (UID: "450a901e-1810-4879-8bc6-97efb2b1c9d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:35.621072 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:35.621040 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfbb3072_c0b3_48da_8291_55700270a1f3.slice/crio-7d781c5c6ac4e93b59c3bce5edc3adff36ade2d72480568f90d75820e5a068dc WatchSource:0}: Error finding container 7d781c5c6ac4e93b59c3bce5edc3adff36ade2d72480568f90d75820e5a068dc: Status 404 returned error can't find the container with id 7d781c5c6ac4e93b59c3bce5edc3adff36ade2d72480568f90d75820e5a068dc Apr 22 19:57:35.622292 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:35.622264 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef5f98dc_99df_42ee_b6ba_f81c8f509e56.slice/crio-189a898cce0aab54965bc9395443bad167a3045b2f18f7fc862dbd9020152b79 WatchSource:0}: Error finding container 189a898cce0aab54965bc9395443bad167a3045b2f18f7fc862dbd9020152b79: Status 404 returned error can't find the container with id 189a898cce0aab54965bc9395443bad167a3045b2f18f7fc862dbd9020152b79 Apr 22 19:57:35.625633 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:35.624764 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6577ea_fdb6_4a04_8c7a_6cee5fdfcc76.slice/crio-818caa39aaf31acad124152bdbf9dbdea5b4a3b13358af4f16981bd029ff4258 WatchSource:0}: Error finding container 818caa39aaf31acad124152bdbf9dbdea5b4a3b13358af4f16981bd029ff4258: Status 404 returned error can't find the container with id 818caa39aaf31acad124152bdbf9dbdea5b4a3b13358af4f16981bd029ff4258 Apr 22 19:57:35.626612 
ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:35.626585 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4bf3cd_4394_40c1_a8fc_3a9c169a083c.slice/crio-e3fc73389a1102ff0d8bac96677195a0507e356192bab2d724e742b19aa11a81 WatchSource:0}: Error finding container e3fc73389a1102ff0d8bac96677195a0507e356192bab2d724e742b19aa11a81: Status 404 returned error can't find the container with id e3fc73389a1102ff0d8bac96677195a0507e356192bab2d724e742b19aa11a81 Apr 22 19:57:35.627433 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:35.627407 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8071e1d3_8155_4265_9de0_c92543778149.slice/crio-2b16d08a1a3b1554ec1dd8fc9ec5ca2d1f749cc9f8ddb93062a1156fa26a21fb WatchSource:0}: Error finding container 2b16d08a1a3b1554ec1dd8fc9ec5ca2d1f749cc9f8ddb93062a1156fa26a21fb: Status 404 returned error can't find the container with id 2b16d08a1a3b1554ec1dd8fc9ec5ca2d1f749cc9f8ddb93062a1156fa26a21fb Apr 22 19:57:35.629154 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:35.629133 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a7bdf57_222a_4e36_b827_d320c2eaaac4.slice/crio-2d691da8ee8db5398035862b75f0a2bcff6a663f64251169d76646f4ea69ccd6 WatchSource:0}: Error finding container 2d691da8ee8db5398035862b75f0a2bcff6a663f64251169d76646f4ea69ccd6: Status 404 returned error can't find the container with id 2d691da8ee8db5398035862b75f0a2bcff6a663f64251169d76646f4ea69ccd6 Apr 22 19:57:35.651981 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:35.651949 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41898967_cc03_4d1a_a021_cc3f7817d848.slice/crio-55d1f5bcac3f05799c2ce958b60fa98e3b3b233409f03ddc71fdceb4a844d09b WatchSource:0}: Error 
finding container 55d1f5bcac3f05799c2ce958b60fa98e3b3b233409f03ddc71fdceb4a844d09b: Status 404 returned error can't find the container with id 55d1f5bcac3f05799c2ce958b60fa98e3b3b233409f03ddc71fdceb4a844d09b Apr 22 19:57:35.652619 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:35.652594 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c69cc19_8d6a_4316_b289_a47a9ce15bd3.slice/crio-f0be288073dafd619746704eb69fe1ce5de67e3fe3f69785d526e525687c68e0 WatchSource:0}: Error finding container f0be288073dafd619746704eb69fe1ce5de67e3fe3f69785d526e525687c68e0: Status 404 returned error can't find the container with id f0be288073dafd619746704eb69fe1ce5de67e3fe3f69785d526e525687c68e0 Apr 22 19:57:35.847993 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.847784 2548 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:33 +0000 UTC" deadline="2028-02-03 13:23:36.355147678 +0000 UTC" Apr 22 19:57:35.847993 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.847990 2548 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15641h26m0.507161739s" Apr 22 19:57:35.954571 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.954449 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lq662" event={"ID":"8071e1d3-8155-4265-9de0-c92543778149","Type":"ContainerStarted","Data":"2b16d08a1a3b1554ec1dd8fc9ec5ca2d1f749cc9f8ddb93062a1156fa26a21fb"} Apr 22 19:57:35.957679 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.957633 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" event={"ID":"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76","Type":"ContainerStarted","Data":"818caa39aaf31acad124152bdbf9dbdea5b4a3b13358af4f16981bd029ff4258"} Apr 22 19:57:35.959412 
ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.959382 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" event={"ID":"2c69cc19-8d6a-4316-b289-a47a9ce15bd3","Type":"ContainerStarted","Data":"f0be288073dafd619746704eb69fe1ce5de67e3fe3f69785d526e525687c68e0"} Apr 22 19:57:35.960744 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.960716 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5xnhn" event={"ID":"41898967-cc03-4d1a-a021-cc3f7817d848","Type":"ContainerStarted","Data":"55d1f5bcac3f05799c2ce958b60fa98e3b3b233409f03ddc71fdceb4a844d09b"} Apr 22 19:57:35.961741 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.961706 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerStarted","Data":"2d691da8ee8db5398035862b75f0a2bcff6a663f64251169d76646f4ea69ccd6"} Apr 22 19:57:35.962707 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.962686 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r6qd8" event={"ID":"9d4bf3cd-4394-40c1-a8fc-3a9c169a083c","Type":"ContainerStarted","Data":"e3fc73389a1102ff0d8bac96677195a0507e356192bab2d724e742b19aa11a81"} Apr 22 19:57:35.963650 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.963624 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zkw8l" event={"ID":"ef5f98dc-99df-42ee-b6ba-f81c8f509e56","Type":"ContainerStarted","Data":"189a898cce0aab54965bc9395443bad167a3045b2f18f7fc862dbd9020152b79"} Apr 22 19:57:35.964532 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.964507 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vqlnc" 
event={"ID":"dfbb3072-c0b3-48da-8291-55700270a1f3","Type":"ContainerStarted","Data":"7d781c5c6ac4e93b59c3bce5edc3adff36ade2d72480568f90d75820e5a068dc"} Apr 22 19:57:35.966432 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.966411 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" event={"ID":"baab329e26e3fec548046283f03a6805","Type":"ContainerStarted","Data":"22e8c918115b4302a81e7a6f9c0892658250d07de56c6badb9960b3941c8afd3"} Apr 22 19:57:35.978336 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:35.978288 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" podStartSLOduration=2.978273426 podStartE2EDuration="2.978273426s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:35.977824457 +0000 UTC m=+3.566315786" watchObservedRunningTime="2026-04-22 19:57:35.978273426 +0000 UTC m=+3.566764750" Apr 22 19:57:36.444231 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:36.443388 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwgfj\" (UniqueName: \"kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj\") pod \"network-check-target-pk9ff\" (UID: \"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf\") " pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:36.444231 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:36.443549 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:36.444231 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:36.443627 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:36.444231 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:36.443704 2548 projected.go:194] Error preparing data for projected volume kube-api-access-kwgfj for pod openshift-network-diagnostics/network-check-target-pk9ff: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:36.444231 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:36.443863 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj podName:2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf nodeName:}" failed. No retries permitted until 2026-04-22 19:57:38.44378107 +0000 UTC m=+6.032272391 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kwgfj" (UniqueName: "kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj") pod "network-check-target-pk9ff" (UID: "2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:36.544582 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:36.544535 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:36.544769 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:36.544751 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:36.544833 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:36.544817 2548 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs podName:450a901e-1810-4879-8bc6-97efb2b1c9d9 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:38.544798702 +0000 UTC m=+6.133290024 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs") pod "network-metrics-daemon-jjztz" (UID: "450a901e-1810-4879-8bc6-97efb2b1c9d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:36.948416 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:36.947789 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:36.948416 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:36.947941 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9" Apr 22 19:57:36.948416 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:36.948336 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:36.949087 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:36.948455 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf" Apr 22 19:57:36.972344 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:36.971924 2548 generic.go:358] "Generic (PLEG): container finished" podID="9a85548356ac6e6ad9bedf610076abee" containerID="9f3e1f5ee437643dd4633bc399b3e7c615f77cf9b675aed0820a681698614b85" exitCode=0 Apr 22 19:57:36.972344 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:36.972296 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" event={"ID":"9a85548356ac6e6ad9bedf610076abee","Type":"ContainerDied","Data":"9f3e1f5ee437643dd4633bc399b3e7c615f77cf9b675aed0820a681698614b85"} Apr 22 19:57:37.985524 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:37.985484 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" event={"ID":"9a85548356ac6e6ad9bedf610076abee","Type":"ContainerStarted","Data":"9268b8de34725c8658293f6ee49462598471d6086615cc390f33043fdf366734"} Apr 22 19:57:38.463738 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:38.463646 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwgfj\" (UniqueName: \"kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj\") pod \"network-check-target-pk9ff\" (UID: \"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf\") " pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:38.463931 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:38.463819 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:38.463931 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:38.463845 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:38.463931 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:38.463859 2548 projected.go:194] Error preparing data for projected volume kube-api-access-kwgfj for pod openshift-network-diagnostics/network-check-target-pk9ff: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:38.463931 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:38.463928 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj podName:2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf nodeName:}" failed. No retries permitted until 2026-04-22 19:57:42.463909723 +0000 UTC m=+10.052401048 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kwgfj" (UniqueName: "kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj") pod "network-check-target-pk9ff" (UID: "2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:38.565033 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:38.564527 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:38.565033 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:38.564720 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:38.565033 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:38.564771 2548 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs podName:450a901e-1810-4879-8bc6-97efb2b1c9d9 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:42.564757639 +0000 UTC m=+10.153248946 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs") pod "network-metrics-daemon-jjztz" (UID: "450a901e-1810-4879-8bc6-97efb2b1c9d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:38.947902 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:38.947386 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:38.947902 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:38.947497 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9" Apr 22 19:57:38.947902 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:38.947801 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:38.947902 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:38.947872 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf" Apr 22 19:57:40.947912 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:40.947478 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:40.947912 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:40.947518 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:40.947912 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:40.947601 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf" Apr 22 19:57:40.947912 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:40.947746 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9" Apr 22 19:57:42.265897 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.265840 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" podStartSLOduration=9.265818995 podStartE2EDuration="9.265818995s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:37.999044975 +0000 UTC m=+5.587536305" watchObservedRunningTime="2026-04-22 19:57:42.265818995 +0000 UTC m=+9.854310316" Apr 22 19:57:42.266772 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.266750 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tscxv"] Apr 22 19:57:42.270792 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.270769 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.273493 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.273466 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7782d\"" Apr 22 19:57:42.274183 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.274167 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:57:42.274312 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.274294 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:57:42.300519 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.300490 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e37b1adb-be11-4c0a-beea-dbf70b8cda38-hosts-file\") pod \"node-resolver-tscxv\" (UID: \"e37b1adb-be11-4c0a-beea-dbf70b8cda38\") " pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.300677 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.300563 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nvzd\" (UniqueName: \"kubernetes.io/projected/e37b1adb-be11-4c0a-beea-dbf70b8cda38-kube-api-access-6nvzd\") pod \"node-resolver-tscxv\" (UID: \"e37b1adb-be11-4c0a-beea-dbf70b8cda38\") " pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.300677 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.300600 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e37b1adb-be11-4c0a-beea-dbf70b8cda38-tmp-dir\") pod \"node-resolver-tscxv\" (UID: \"e37b1adb-be11-4c0a-beea-dbf70b8cda38\") " pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.401180 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.401141 
2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e37b1adb-be11-4c0a-beea-dbf70b8cda38-tmp-dir\") pod \"node-resolver-tscxv\" (UID: \"e37b1adb-be11-4c0a-beea-dbf70b8cda38\") " pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.401384 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.401219 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e37b1adb-be11-4c0a-beea-dbf70b8cda38-hosts-file\") pod \"node-resolver-tscxv\" (UID: \"e37b1adb-be11-4c0a-beea-dbf70b8cda38\") " pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.401384 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.401315 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nvzd\" (UniqueName: \"kubernetes.io/projected/e37b1adb-be11-4c0a-beea-dbf70b8cda38-kube-api-access-6nvzd\") pod \"node-resolver-tscxv\" (UID: \"e37b1adb-be11-4c0a-beea-dbf70b8cda38\") " pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.402042 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.402010 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e37b1adb-be11-4c0a-beea-dbf70b8cda38-tmp-dir\") pod \"node-resolver-tscxv\" (UID: \"e37b1adb-be11-4c0a-beea-dbf70b8cda38\") " pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.402137 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.402108 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e37b1adb-be11-4c0a-beea-dbf70b8cda38-hosts-file\") pod \"node-resolver-tscxv\" (UID: \"e37b1adb-be11-4c0a-beea-dbf70b8cda38\") " pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.412377 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.412345 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6nvzd\" (UniqueName: \"kubernetes.io/projected/e37b1adb-be11-4c0a-beea-dbf70b8cda38-kube-api-access-6nvzd\") pod \"node-resolver-tscxv\" (UID: \"e37b1adb-be11-4c0a-beea-dbf70b8cda38\") " pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.502646 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.502602 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwgfj\" (UniqueName: \"kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj\") pod \"network-check-target-pk9ff\" (UID: \"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf\") " pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:42.502823 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:42.502804 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:42.502881 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:42.502832 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:42.502881 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:42.502848 2548 projected.go:194] Error preparing data for projected volume kube-api-access-kwgfj for pod openshift-network-diagnostics/network-check-target-pk9ff: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:42.502984 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:42.502918 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj podName:2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf nodeName:}" failed. No retries permitted until 2026-04-22 19:57:50.502898902 +0000 UTC m=+18.091390224 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kwgfj" (UniqueName: "kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj") pod "network-check-target-pk9ff" (UID: "2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:42.582460 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.581988 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tscxv" Apr 22 19:57:42.603105 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.603055 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:42.603317 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:42.603228 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:42.603394 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:42.603326 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs podName:450a901e-1810-4879-8bc6-97efb2b1c9d9 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:50.603303501 +0000 UTC m=+18.191794824 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs") pod "network-metrics-daemon-jjztz" (UID: "450a901e-1810-4879-8bc6-97efb2b1c9d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:42.948428 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.948334 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:42.948565 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:42.948464 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9" Apr 22 19:57:42.948565 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:42.948508 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:42.948632 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:42.948568 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf" Apr 22 19:57:44.948150 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:44.948060 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:44.948564 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:44.948082 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:44.948564 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:44.948211 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf" Apr 22 19:57:44.948564 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:44.948281 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9" Apr 22 19:57:46.947750 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:46.947305 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:46.947750 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:46.947340 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:46.947750 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:46.947432 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9" Apr 22 19:57:46.947750 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:46.947563 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf" Apr 22 19:57:48.947340 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:48.947309 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:48.947340 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:48.947350 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:48.947780 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:48.947436 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9" Apr 22 19:57:48.947780 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:48.947578 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf" Apr 22 19:57:50.561724 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:50.561679 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwgfj\" (UniqueName: \"kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj\") pod \"network-check-target-pk9ff\" (UID: \"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf\") " pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:50.562139 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:50.561824 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:50.562139 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:50.561839 2548 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:50.562139 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:50.561850 2548 projected.go:194] Error preparing data for projected volume kube-api-access-kwgfj for pod openshift-network-diagnostics/network-check-target-pk9ff: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:50.562139 ip-10-0-139-10 kubenswrapper[2548]: E0422 
19:57:50.561918 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj podName:2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf nodeName:}" failed. No retries permitted until 2026-04-22 19:58:06.561898954 +0000 UTC m=+34.150390264 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kwgfj" (UniqueName: "kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj") pod "network-check-target-pk9ff" (UID: "2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:50.662504 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:50.662466 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:50.662688 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:50.662632 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:50.662746 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:50.662714 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs podName:450a901e-1810-4879-8bc6-97efb2b1c9d9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:06.662693244 +0000 UTC m=+34.251184555 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs") pod "network-metrics-daemon-jjztz" (UID: "450a901e-1810-4879-8bc6-97efb2b1c9d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:50.947824 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:50.947735 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:50.948014 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:50.947894 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9" Apr 22 19:57:50.948014 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:50.947972 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:50.948108 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:50.948068 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf" Apr 22 19:57:52.832877 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:57:52.832832 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37b1adb_be11_4c0a_beea_dbf70b8cda38.slice/crio-5f34a0c9c3efb6c242fb9b4c179ff6e50aa809979a5204ba86f04f2832a78afc WatchSource:0}: Error finding container 5f34a0c9c3efb6c242fb9b4c179ff6e50aa809979a5204ba86f04f2832a78afc: Status 404 returned error can't find the container with id 5f34a0c9c3efb6c242fb9b4c179ff6e50aa809979a5204ba86f04f2832a78afc Apr 22 19:57:52.948862 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:52.948832 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:57:52.948998 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:52.948922 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf" Apr 22 19:57:52.948998 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:52.948970 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:57:52.949159 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:52.949024 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9"
Apr 22 19:57:53.028003 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:53.027770 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tscxv" event={"ID":"e37b1adb-be11-4c0a-beea-dbf70b8cda38","Type":"ContainerStarted","Data":"5f34a0c9c3efb6c242fb9b4c179ff6e50aa809979a5204ba86f04f2832a78afc"}
Apr 22 19:57:54.031522 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.031339 2548 generic.go:358] "Generic (PLEG): container finished" podID="8071e1d3-8155-4265-9de0-c92543778149" containerID="7bf403320079cfa403ca01a9a79a23f33ae85e481124b7ec042b590fdf858390" exitCode=0
Apr 22 19:57:54.032443 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.031428 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lq662" event={"ID":"8071e1d3-8155-4265-9de0-c92543778149","Type":"ContainerDied","Data":"7bf403320079cfa403ca01a9a79a23f33ae85e481124b7ec042b590fdf858390"}
Apr 22 19:57:54.032726 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.032700 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" event={"ID":"0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76","Type":"ContainerStarted","Data":"7172e0471e09a9641dabf4f2e18e1d384b35fe61149105d81e378ef8f7314ffc"}
Apr 22 19:57:54.034073 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.034054 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tscxv" event={"ID":"e37b1adb-be11-4c0a-beea-dbf70b8cda38","Type":"ContainerStarted","Data":"8f1c82e8d6fbe11eac47ee5548a53dd3d7a543946d662455af7e14ad25495020"}
Apr 22 19:57:54.035359 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.035340 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" event={"ID":"2c69cc19-8d6a-4316-b289-a47a9ce15bd3","Type":"ContainerStarted","Data":"1217a8fa54e6a9e87195d462677b48c771ed07c973ebb83bf444dab48faa69a2"}
Apr 22 19:57:54.037559 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.037542 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 19:57:54.037816 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.037800 2548 generic.go:358] "Generic (PLEG): container finished" podID="3a7bdf57-222a-4e36-b827-d320c2eaaac4" containerID="d04d8eb4836bce66ce31903f4b976e3ad7c2962fa4acb82cdc4bdce356ec8c77" exitCode=1
Apr 22 19:57:54.037884 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.037852 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerStarted","Data":"944a27d227b00fb0b1fccedcbc494cc03ad8a032aa99de828a344317ad165689"}
Apr 22 19:57:54.037884 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.037869 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerStarted","Data":"8339b7cebd36c3f5fe6fafb898b53788fbf2fc46b284775e5e3a0c08aa89d4ba"}
Apr 22 19:57:54.037948 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.037883 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerStarted","Data":"fd38c960e45fbb900d1e5e81eb09b5db6f0b2ddebff88793b5ae2f4a70c8a904"}
Apr 22 19:57:54.037948 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.037895 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerDied","Data":"d04d8eb4836bce66ce31903f4b976e3ad7c2962fa4acb82cdc4bdce356ec8c77"}
Apr 22 19:57:54.037948 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.037909 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerStarted","Data":"97c4c0215cc1e2b65b2553053462453fe78e28e9820c4ee0c3e25a98037420b9"}
Apr 22 19:57:54.038954 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.038937 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r6qd8" event={"ID":"9d4bf3cd-4394-40c1-a8fc-3a9c169a083c","Type":"ContainerStarted","Data":"f129e7c27b5a749d1f8a920e213831895018c3a59b83ac861092065b205c51d5"}
Apr 22 19:57:54.040134 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.040117 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zkw8l" event={"ID":"ef5f98dc-99df-42ee-b6ba-f81c8f509e56","Type":"ContainerStarted","Data":"2aa725dbdacb5cbe92d0a03be987dbe64994c549e120bc3e9447ba9110944a59"}
Apr 22 19:57:54.041169 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.041151 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vqlnc" event={"ID":"dfbb3072-c0b3-48da-8291-55700270a1f3","Type":"ContainerStarted","Data":"a9bf8267812c7102f97594f50f2121a4d76eaef8290b7672e2cba6c4c57336eb"}
Apr 22 19:57:54.066629 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.066594 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zkw8l" podStartSLOduration=3.794572056 podStartE2EDuration="21.066582273s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="2026-04-22 19:57:35.623876832 +0000 UTC m=+3.212368142" lastFinishedPulling="2026-04-22 19:57:52.895887047 +0000 UTC m=+20.484378359" observedRunningTime="2026-04-22 19:57:54.066177647 +0000 UTC m=+21.654668970" watchObservedRunningTime="2026-04-22 19:57:54.066582273 +0000 UTC m=+21.655073602"
Apr 22 19:57:54.094184 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.094143 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5hz9q" podStartSLOduration=3.833992125 podStartE2EDuration="21.094130451s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="2026-04-22 19:57:35.626578352 +0000 UTC m=+3.215069659" lastFinishedPulling="2026-04-22 19:57:52.886716679 +0000 UTC m=+20.475207985" observedRunningTime="2026-04-22 19:57:54.081454328 +0000 UTC m=+21.669945659" watchObservedRunningTime="2026-04-22 19:57:54.094130451 +0000 UTC m=+21.682621807"
Apr 22 19:57:54.107435 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.107396 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tscxv" podStartSLOduration=12.107383186 podStartE2EDuration="12.107383186s" podCreationTimestamp="2026-04-22 19:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:54.0942693 +0000 UTC m=+21.682760610" watchObservedRunningTime="2026-04-22 19:57:54.107383186 +0000 UTC m=+21.695874515"
Apr 22 19:57:54.121141 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.121104 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-r6qd8" podStartSLOduration=5.257940662 podStartE2EDuration="22.121091321s" podCreationTimestamp="2026-04-22 19:57:32 +0000 UTC" firstStartedPulling="2026-04-22 19:57:35.628393023 +0000 UTC m=+3.216884344" lastFinishedPulling="2026-04-22 19:57:52.491543695 +0000 UTC m=+20.080035003" observedRunningTime="2026-04-22 19:57:54.121025377 +0000 UTC m=+21.709516706" watchObservedRunningTime="2026-04-22 19:57:54.121091321 +0000 UTC m=+21.709582650"
Apr 22 19:57:54.121236 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.121223 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vqlnc" podStartSLOduration=8.652580764 podStartE2EDuration="21.121218713s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="2026-04-22 19:57:35.623415081 +0000 UTC m=+3.211906403" lastFinishedPulling="2026-04-22 19:57:48.09205303 +0000 UTC m=+15.680544352" observedRunningTime="2026-04-22 19:57:54.107118301 +0000 UTC m=+21.695609629" watchObservedRunningTime="2026-04-22 19:57:54.121218713 +0000 UTC m=+21.709710042"
Apr 22 19:57:54.422725 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.422496 2548 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:57:54.895025 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.894916 2548 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:57:54.422708292Z","UUID":"ea1f6c54-1db7-4d89-a4d2-cb95d67c166e","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:57:54.897576 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.897194 2548 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:57:54.897576 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.897224 2548 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:57:54.948719 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.947410 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff"
Apr 22 19:57:54.948719 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:54.947449 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 19:57:54.948719 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:54.947546 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf"
Apr 22 19:57:54.948719 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:54.947933 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9"
Apr 22 19:57:55.046336 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:55.046298 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" event={"ID":"2c69cc19-8d6a-4316-b289-a47a9ce15bd3","Type":"ContainerStarted","Data":"60c7e7e56c1f9431f2dbd58c2f48cd6be4ce56aaf14743e1f81d7e2fbb5cbef0"}
Apr 22 19:57:55.047977 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:55.047945 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5xnhn" event={"ID":"41898967-cc03-4d1a-a021-cc3f7817d848","Type":"ContainerStarted","Data":"f29b0ba8129a6c9614bfcce83dfd81a6972e0593369f1df8edabc6eec00bb724"}
Apr 22 19:57:55.051434 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:55.051410 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 19:57:55.052369 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:55.051816 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerStarted","Data":"21dc0845bd6051edeafc8ee86e9662d85864caab282cb7a10c2e26a22815e843"}
Apr 22 19:57:55.061535 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:55.061489 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5xnhn" podStartSLOduration=5.857688729 podStartE2EDuration="23.061473838s" podCreationTimestamp="2026-04-22 19:57:32 +0000 UTC" firstStartedPulling="2026-04-22 19:57:35.654823473 +0000 UTC m=+3.243314780" lastFinishedPulling="2026-04-22 19:57:52.858608568 +0000 UTC m=+20.447099889" observedRunningTime="2026-04-22 19:57:55.061163413 +0000 UTC m=+22.649654757" watchObservedRunningTime="2026-04-22 19:57:55.061473838 +0000 UTC m=+22.649965167"
Apr 22 19:57:56.056533 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:56.056501 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" event={"ID":"2c69cc19-8d6a-4316-b289-a47a9ce15bd3","Type":"ContainerStarted","Data":"78104179d084d959054719f7f8235a2455a0545eda34b14578b2f292efb302d5"}
Apr 22 19:57:56.076988 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:56.075324 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9vshv" podStartSLOduration=3.073672285 podStartE2EDuration="23.075301103s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="2026-04-22 19:57:35.654676529 +0000 UTC m=+3.243167840" lastFinishedPulling="2026-04-22 19:57:55.656305347 +0000 UTC m=+23.244796658" observedRunningTime="2026-04-22 19:57:56.072928052 +0000 UTC m=+23.661419380" watchObservedRunningTime="2026-04-22 19:57:56.075301103 +0000 UTC m=+23.663792433"
Apr 22 19:57:56.947692 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:56.947652 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff"
Apr 22 19:57:56.947898 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:56.947670 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 19:57:56.947898 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:56.947782 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf"
Apr 22 19:57:56.947898 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:56.947875 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9"
Apr 22 19:57:57.061060 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:57.061028 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 19:57:57.061626 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:57.061488 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerStarted","Data":"7d39a3aea6f0a8788898aa3c4224778365e4526c919e07bcbe4cd6a49e84bd24"}
Apr 22 19:57:57.770675 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:57.770644 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-r6qd8"
Apr 22 19:57:57.771344 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:57.771315 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-r6qd8"
Apr 22 19:57:58.948154 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:58.947911 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 19:57:58.948745 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:58.947950 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff"
Apr 22 19:57:58.948745 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:58.948293 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9"
Apr 22 19:57:58.948745 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:57:58.948312 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf"
Apr 22 19:57:59.066333 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:59.066301 2548 generic.go:358] "Generic (PLEG): container finished" podID="8071e1d3-8155-4265-9de0-c92543778149" containerID="51e53d59d050839d5dae4490bbf4877e3116ec0c120e4b140486b2b71cf4e3bd" exitCode=0
Apr 22 19:57:59.066518 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:59.066389 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lq662" event={"ID":"8071e1d3-8155-4265-9de0-c92543778149","Type":"ContainerDied","Data":"51e53d59d050839d5dae4490bbf4877e3116ec0c120e4b140486b2b71cf4e3bd"}
Apr 22 19:57:59.069599 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:59.069573 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 19:57:59.069903 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:59.069882 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerStarted","Data":"a4cb874b113b5da51eecf11187a37954e25ec1acca078af2b1d0551283fd7478"}
Apr 22 19:57:59.070160 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:59.070141 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:59.070210 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:59.070173 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:57:59.070348 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:59.070335 2548 scope.go:117] "RemoveContainer" containerID="d04d8eb4836bce66ce31903f4b976e3ad7c2962fa4acb82cdc4bdce356ec8c77"
Apr 22 19:57:59.085749 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:57:59.085729 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:58:00.076233 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.076207 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 19:58:00.076778 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.076601 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" event={"ID":"3a7bdf57-222a-4e36-b827-d320c2eaaac4","Type":"ContainerStarted","Data":"e2f37222d03b686ed73d09bbb1f41763bb94d56e30143dad7c99a9b0a2e5efd4"}
Apr 22 19:58:00.077108 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.077076 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:58:00.078861 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.078838 2548 generic.go:358] "Generic (PLEG): container finished" podID="8071e1d3-8155-4265-9de0-c92543778149" containerID="2ab99fb84bc40e04341f62a9bcbe986422f5e0f731f496d85595f0ebe52017fb" exitCode=0
Apr 22 19:58:00.078976 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.078874 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lq662" event={"ID":"8071e1d3-8155-4265-9de0-c92543778149","Type":"ContainerDied","Data":"2ab99fb84bc40e04341f62a9bcbe986422f5e0f731f496d85595f0ebe52017fb"}
Apr 22 19:58:00.094176 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.094153 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r"
Apr 22 19:58:00.104077 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.104035 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" podStartSLOduration=9.807480037 podStartE2EDuration="27.104019821s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="2026-04-22 19:57:35.650336685 +0000 UTC m=+3.238827992" lastFinishedPulling="2026-04-22 19:57:52.946876465 +0000 UTC m=+20.535367776" observedRunningTime="2026-04-22 19:58:00.103409671 +0000 UTC m=+27.691901002" watchObservedRunningTime="2026-04-22 19:58:00.104019821 +0000 UTC m=+27.692511149"
Apr 22 19:58:00.327627 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.327393 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jjztz"]
Apr 22 19:58:00.327779 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.327704 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 19:58:00.327852 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:00.327829 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9"
Apr 22 19:58:00.330103 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.330073 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pk9ff"]
Apr 22 19:58:00.330240 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:00.330187 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff"
Apr 22 19:58:00.330332 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:00.330316 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf"
Apr 22 19:58:01.082572 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:01.082534 2548 generic.go:358] "Generic (PLEG): container finished" podID="8071e1d3-8155-4265-9de0-c92543778149" containerID="adb66670527633fb37b290a23cced357ea4dece7dd10a56ce9b13a3a4b7a37a3" exitCode=0
Apr 22 19:58:01.082975 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:01.082612 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lq662" event={"ID":"8071e1d3-8155-4265-9de0-c92543778149","Type":"ContainerDied","Data":"adb66670527633fb37b290a23cced357ea4dece7dd10a56ce9b13a3a4b7a37a3"}
Apr 22 19:58:01.948185 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:01.948146 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff"
Apr 22 19:58:01.948451 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:01.948153 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 19:58:01.948451 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:01.948327 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf"
Apr 22 19:58:01.948611 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:01.948497 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9"
Apr 22 19:58:03.947709 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:03.947674 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 19:58:03.947709 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:03.947698 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff"
Apr 22 19:58:03.948494 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:03.947804 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9"
Apr 22 19:58:03.948494 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:03.947928 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pk9ff" podUID="2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf"
Apr 22 19:58:04.485094 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:04.485020 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-r6qd8"
Apr 22 19:58:04.485294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:04.485177 2548 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 19:58:04.486157 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:04.486131 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-r6qd8"
Apr 22 19:58:05.748059 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.748029 2548 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeReady"
Apr 22 19:58:05.748713 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.748181 2548 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 19:58:05.793768 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.793732 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8pzdx"]
Apr 22 19:58:05.797507 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.797483 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rr5vp"]
Apr 22 19:58:05.797658 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.797643 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.800789 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.800290 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 19:58:05.800789 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.800395 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rr5vp"
Apr 22 19:58:05.800789 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.800483 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tzbrr\""
Apr 22 19:58:05.800789 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.800487 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 19:58:05.802788 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.802763 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 19:58:05.802788 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.802769 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 19:58:05.802966 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.802845 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 19:58:05.802966 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.802772 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lxbw7\""
Apr 22 19:58:05.805741 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.805463 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8pzdx"]
Apr 22 19:58:05.809172 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.809148 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rr5vp"]
Apr 22 19:58:05.874504 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.874468 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sjn8\" (UniqueName: \"kubernetes.io/projected/7fa06416-712c-490d-a430-2c086187fab9-kube-api-access-4sjn8\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.874708 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.874526 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fa06416-712c-490d-a430-2c086187fab9-tmp-dir\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.874708 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.874598 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.874708 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.874622 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fa06416-712c-490d-a430-2c086187fab9-config-volume\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.874708 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.874638 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7rxg\" (UniqueName: \"kubernetes.io/projected/26ac7310-bd02-469f-9a0e-31a38e294dc3-kube-api-access-r7rxg\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp"
Apr 22 19:58:05.874708 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.874678 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp"
Apr 22 19:58:05.948035 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.947951 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff"
Apr 22 19:58:05.948195 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.948154 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 19:58:05.951187 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.951162 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:58:05.951338 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.951323 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:58:05.951412 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.951353 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cbww9\""
Apr 22 19:58:05.951457 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.951428 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:58:05.951500 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.951464 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-79t2x\""
Apr 22 19:58:05.975479 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.975450 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.975624 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.975494 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fa06416-712c-490d-a430-2c086187fab9-config-volume\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.975624 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.975522 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7rxg\" (UniqueName: \"kubernetes.io/projected/26ac7310-bd02-469f-9a0e-31a38e294dc3-kube-api-access-r7rxg\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp"
Apr 22 19:58:05.975624 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.975560 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp"
Apr 22 19:58:05.975792 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:05.975635 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:58:05.975792 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:05.975647 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:58:05.975792 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:05.975716 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert podName:26ac7310-bd02-469f-9a0e-31a38e294dc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:06.475697208 +0000 UTC m=+34.064188532 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert") pod "ingress-canary-rr5vp" (UID: "26ac7310-bd02-469f-9a0e-31a38e294dc3") : secret "canary-serving-cert" not found
Apr 22 19:58:05.975792 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:05.975734 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls podName:7fa06416-712c-490d-a430-2c086187fab9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:06.475724487 +0000 UTC m=+34.064215797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls") pod "dns-default-8pzdx" (UID: "7fa06416-712c-490d-a430-2c086187fab9") : secret "dns-default-metrics-tls" not found
Apr 22 19:58:05.975792 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.975751 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sjn8\" (UniqueName: \"kubernetes.io/projected/7fa06416-712c-490d-a430-2c086187fab9-kube-api-access-4sjn8\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.975792 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.975784 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fa06416-712c-490d-a430-2c086187fab9-tmp-dir\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.976100 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.976036 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fa06416-712c-490d-a430-2c086187fab9-config-volume\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.976100 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.976073 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fa06416-712c-490d-a430-2c086187fab9-tmp-dir\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:05.986556 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.986529 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7rxg\" (UniqueName: \"kubernetes.io/projected/26ac7310-bd02-469f-9a0e-31a38e294dc3-kube-api-access-r7rxg\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp"
Apr 22 19:58:05.986733 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:05.986706 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sjn8\" (UniqueName: \"kubernetes.io/projected/7fa06416-712c-490d-a430-2c086187fab9-kube-api-access-4sjn8\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:06.479535 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:06.479497 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx"
Apr 22 19:58:06.479737 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:06.479561 2548 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 19:58:06.479737 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:06.479673 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:06.479844 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:06.479754 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls podName:7fa06416-712c-490d-a430-2c086187fab9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:07.479734017 +0000 UTC m=+35.068225328 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls") pod "dns-default-8pzdx" (UID: "7fa06416-712c-490d-a430-2c086187fab9") : secret "dns-default-metrics-tls" not found Apr 22 19:58:06.479844 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:06.479763 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:06.479844 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:06.479806 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert podName:26ac7310-bd02-469f-9a0e-31a38e294dc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:07.47979268 +0000 UTC m=+35.068283987 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert") pod "ingress-canary-rr5vp" (UID: "26ac7310-bd02-469f-9a0e-31a38e294dc3") : secret "canary-serving-cert" not found Apr 22 19:58:06.580229 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:06.580191 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwgfj\" (UniqueName: \"kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj\") pod \"network-check-target-pk9ff\" (UID: \"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf\") " pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:58:06.583506 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:06.583448 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwgfj\" (UniqueName: \"kubernetes.io/projected/2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf-kube-api-access-kwgfj\") pod \"network-check-target-pk9ff\" (UID: \"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf\") " pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:58:06.680544 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:06.680509 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:58:06.680710 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:06.680668 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:58:06.680752 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:06.680736 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs podName:450a901e-1810-4879-8bc6-97efb2b1c9d9 nodeName:}" 
failed. No retries permitted until 2026-04-22 19:58:38.680719142 +0000 UTC m=+66.269210448 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs") pod "network-metrics-daemon-jjztz" (UID: "450a901e-1810-4879-8bc6-97efb2b1c9d9") : secret "metrics-daemon-secret" not found Apr 22 19:58:06.860004 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:06.859928 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:58:07.068019 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:07.067873 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pk9ff"] Apr 22 19:58:07.071207 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:58:07.071177 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d7bbd1b_a8ac_4542_8b29_4dbfeb3569cf.slice/crio-6a3f8e78ad2118e9d5cdde9959d2805e46df59f547f66efc7bb0cbe0c53d5f97 WatchSource:0}: Error finding container 6a3f8e78ad2118e9d5cdde9959d2805e46df59f547f66efc7bb0cbe0c53d5f97: Status 404 returned error can't find the container with id 6a3f8e78ad2118e9d5cdde9959d2805e46df59f547f66efc7bb0cbe0c53d5f97 Apr 22 19:58:07.095309 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:07.095273 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pk9ff" event={"ID":"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf","Type":"ContainerStarted","Data":"6a3f8e78ad2118e9d5cdde9959d2805e46df59f547f66efc7bb0cbe0c53d5f97"} Apr 22 19:58:07.098343 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:07.098306 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lq662" 
event={"ID":"8071e1d3-8155-4265-9de0-c92543778149","Type":"ContainerStarted","Data":"12bd76a664f393c2bf4876e66ccdebbecedf46dd22539c55591d87fe85b53690"} Apr 22 19:58:07.489060 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:07.489027 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx" Apr 22 19:58:07.489227 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:07.489084 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 19:58:07.489227 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:07.489191 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:07.489328 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:07.489199 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:07.489328 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:07.489299 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls podName:7fa06416-712c-490d-a430-2c086187fab9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:09.489277657 +0000 UTC m=+37.077768979 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls") pod "dns-default-8pzdx" (UID: "7fa06416-712c-490d-a430-2c086187fab9") : secret "dns-default-metrics-tls" not found Apr 22 19:58:07.489421 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:07.489354 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert podName:26ac7310-bd02-469f-9a0e-31a38e294dc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:09.489333173 +0000 UTC m=+37.077824495 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert") pod "ingress-canary-rr5vp" (UID: "26ac7310-bd02-469f-9a0e-31a38e294dc3") : secret "canary-serving-cert" not found Apr 22 19:58:08.103505 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:08.103468 2548 generic.go:358] "Generic (PLEG): container finished" podID="8071e1d3-8155-4265-9de0-c92543778149" containerID="12bd76a664f393c2bf4876e66ccdebbecedf46dd22539c55591d87fe85b53690" exitCode=0 Apr 22 19:58:08.103997 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:08.103547 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lq662" event={"ID":"8071e1d3-8155-4265-9de0-c92543778149","Type":"ContainerDied","Data":"12bd76a664f393c2bf4876e66ccdebbecedf46dd22539c55591d87fe85b53690"} Apr 22 19:58:09.108804 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:09.108770 2548 generic.go:358] "Generic (PLEG): container finished" podID="8071e1d3-8155-4265-9de0-c92543778149" containerID="227cc5cee5bf012a9368942f2cf13db68c309b9e300817b4d2405710e54ee20f" exitCode=0 Apr 22 19:58:09.109294 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:09.108825 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lq662" 
event={"ID":"8071e1d3-8155-4265-9de0-c92543778149","Type":"ContainerDied","Data":"227cc5cee5bf012a9368942f2cf13db68c309b9e300817b4d2405710e54ee20f"} Apr 22 19:58:09.504618 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:09.504577 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx" Apr 22 19:58:09.504822 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:09.504645 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 19:58:09.504822 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:09.504736 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:09.504822 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:09.504747 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:09.504822 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:09.504803 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls podName:7fa06416-712c-490d-a430-2c086187fab9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:13.504784378 +0000 UTC m=+41.093275693 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls") pod "dns-default-8pzdx" (UID: "7fa06416-712c-490d-a430-2c086187fab9") : secret "dns-default-metrics-tls" not found Apr 22 19:58:09.504822 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:09.504823 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert podName:26ac7310-bd02-469f-9a0e-31a38e294dc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:13.504813567 +0000 UTC m=+41.093304874 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert") pod "ingress-canary-rr5vp" (UID: "26ac7310-bd02-469f-9a0e-31a38e294dc3") : secret "canary-serving-cert" not found Apr 22 19:58:10.113525 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:10.113300 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lq662" event={"ID":"8071e1d3-8155-4265-9de0-c92543778149","Type":"ContainerStarted","Data":"40e91ed93e6bacc9bed9858ee7c187477f924975e3d53b910f8dbf5cd760de3f"} Apr 22 19:58:10.135509 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:10.135458 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lq662" podStartSLOduration=5.860660924 podStartE2EDuration="37.135437804s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="2026-04-22 19:57:35.650426075 +0000 UTC m=+3.238917382" lastFinishedPulling="2026-04-22 19:58:06.925202948 +0000 UTC m=+34.513694262" observedRunningTime="2026-04-22 19:58:10.133299564 +0000 UTC m=+37.721790893" watchObservedRunningTime="2026-04-22 19:58:10.135437804 +0000 UTC m=+37.723929133" Apr 22 19:58:11.116880 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:11.116846 2548 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pk9ff" event={"ID":"2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf","Type":"ContainerStarted","Data":"6c306e6ab5ac0bfe99c319633731cc46f1666f568903b1f71554489700070f57"} Apr 22 19:58:11.117367 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:11.117293 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:58:11.131463 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:11.131420 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pk9ff" podStartSLOduration=35.071602649 podStartE2EDuration="38.13140507s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="2026-04-22 19:58:07.073389234 +0000 UTC m=+34.661880546" lastFinishedPulling="2026-04-22 19:58:10.13319166 +0000 UTC m=+37.721682967" observedRunningTime="2026-04-22 19:58:11.130722881 +0000 UTC m=+38.719214232" watchObservedRunningTime="2026-04-22 19:58:11.13140507 +0000 UTC m=+38.719896399" Apr 22 19:58:13.531022 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:13.530988 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx" Apr 22 19:58:13.531022 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:13.531039 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 19:58:13.531452 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:13.531154 2548 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:13.531452 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:13.531155 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:13.531452 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:13.531214 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert podName:26ac7310-bd02-469f-9a0e-31a38e294dc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:21.531199943 +0000 UTC m=+49.119691250 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert") pod "ingress-canary-rr5vp" (UID: "26ac7310-bd02-469f-9a0e-31a38e294dc3") : secret "canary-serving-cert" not found Apr 22 19:58:13.531452 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:13.531226 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls podName:7fa06416-712c-490d-a430-2c086187fab9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:21.531220258 +0000 UTC m=+49.119711565 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls") pod "dns-default-8pzdx" (UID: "7fa06416-712c-490d-a430-2c086187fab9") : secret "dns-default-metrics-tls" not found Apr 22 19:58:21.587308 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:21.587269 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx" Apr 22 19:58:21.587852 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:21.587327 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 19:58:21.587852 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:21.587419 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:21.587852 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:21.587423 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:21.587852 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:21.587482 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert podName:26ac7310-bd02-469f-9a0e-31a38e294dc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:37.587467387 +0000 UTC m=+65.175958694 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert") pod "ingress-canary-rr5vp" (UID: "26ac7310-bd02-469f-9a0e-31a38e294dc3") : secret "canary-serving-cert" not found Apr 22 19:58:21.587852 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:21.587494 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls podName:7fa06416-712c-490d-a430-2c086187fab9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:37.587488781 +0000 UTC m=+65.175980087 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls") pod "dns-default-8pzdx" (UID: "7fa06416-712c-490d-a430-2c086187fab9") : secret "dns-default-metrics-tls" not found Apr 22 19:58:32.095685 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:32.095656 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqc2r" Apr 22 19:58:37.593632 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:37.593590 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx" Apr 22 19:58:37.594084 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:37.593647 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 19:58:37.594084 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:37.593746 2548 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:37.594084 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:37.593801 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert podName:26ac7310-bd02-469f-9a0e-31a38e294dc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:09.593786531 +0000 UTC m=+97.182277837 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert") pod "ingress-canary-rr5vp" (UID: "26ac7310-bd02-469f-9a0e-31a38e294dc3") : secret "canary-serving-cert" not found Apr 22 19:58:37.594084 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:37.593825 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:37.594084 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:37.593913 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls podName:7fa06416-712c-490d-a430-2c086187fab9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:09.593891949 +0000 UTC m=+97.182383259 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls") pod "dns-default-8pzdx" (UID: "7fa06416-712c-490d-a430-2c086187fab9") : secret "dns-default-metrics-tls" not found Apr 22 19:58:38.700029 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:38.699947 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:58:38.700438 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:38.700113 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:58:38.700438 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:58:38.700184 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs podName:450a901e-1810-4879-8bc6-97efb2b1c9d9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:42.70016766 +0000 UTC m=+130.288658967 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs") pod "network-metrics-daemon-jjztz" (UID: "450a901e-1810-4879-8bc6-97efb2b1c9d9") : secret "metrics-daemon-secret" not found Apr 22 19:58:41.819737 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:41.819703 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-zcc7b"] Apr 22 19:58:41.874092 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:41.874060 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zcc7b"] Apr 22 19:58:41.874290 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:41.874183 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:41.877056 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:41.877033 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:58:42.021355 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.021320 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0ec9b5fe-fc19-49f4-972f-2a642d1424b6-kubelet-config\") pod \"global-pull-secret-syncer-zcc7b\" (UID: \"0ec9b5fe-fc19-49f4-972f-2a642d1424b6\") " pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:42.021533 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.021379 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0ec9b5fe-fc19-49f4-972f-2a642d1424b6-dbus\") pod \"global-pull-secret-syncer-zcc7b\" (UID: \"0ec9b5fe-fc19-49f4-972f-2a642d1424b6\") " pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:42.021533 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.021398 2548 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0ec9b5fe-fc19-49f4-972f-2a642d1424b6-original-pull-secret\") pod \"global-pull-secret-syncer-zcc7b\" (UID: \"0ec9b5fe-fc19-49f4-972f-2a642d1424b6\") " pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:42.121826 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.121755 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0ec9b5fe-fc19-49f4-972f-2a642d1424b6-kubelet-config\") pod \"global-pull-secret-syncer-zcc7b\" (UID: \"0ec9b5fe-fc19-49f4-972f-2a642d1424b6\") " pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:42.121826 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.121808 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0ec9b5fe-fc19-49f4-972f-2a642d1424b6-dbus\") pod \"global-pull-secret-syncer-zcc7b\" (UID: \"0ec9b5fe-fc19-49f4-972f-2a642d1424b6\") " pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:42.121826 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.121825 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0ec9b5fe-fc19-49f4-972f-2a642d1424b6-original-pull-secret\") pod \"global-pull-secret-syncer-zcc7b\" (UID: \"0ec9b5fe-fc19-49f4-972f-2a642d1424b6\") " pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:42.122023 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.121894 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0ec9b5fe-fc19-49f4-972f-2a642d1424b6-kubelet-config\") pod \"global-pull-secret-syncer-zcc7b\" (UID: \"0ec9b5fe-fc19-49f4-972f-2a642d1424b6\") " pod="kube-system/global-pull-secret-syncer-zcc7b" 
Apr 22 19:58:42.122023 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.121959 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0ec9b5fe-fc19-49f4-972f-2a642d1424b6-dbus\") pod \"global-pull-secret-syncer-zcc7b\" (UID: \"0ec9b5fe-fc19-49f4-972f-2a642d1424b6\") " pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:42.125749 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.125730 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0ec9b5fe-fc19-49f4-972f-2a642d1424b6-original-pull-secret\") pod \"global-pull-secret-syncer-zcc7b\" (UID: \"0ec9b5fe-fc19-49f4-972f-2a642d1424b6\") " pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:42.183070 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.183046 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zcc7b" Apr 22 19:58:42.297490 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:42.297461 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zcc7b"] Apr 22 19:58:42.300655 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:58:42.300626 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ec9b5fe_fc19_49f4_972f_2a642d1424b6.slice/crio-c4c64277e31ad62af2baf1224c5c1ad6be678ad60200274e8205cecbbc13664c WatchSource:0}: Error finding container c4c64277e31ad62af2baf1224c5c1ad6be678ad60200274e8205cecbbc13664c: Status 404 returned error can't find the container with id c4c64277e31ad62af2baf1224c5c1ad6be678ad60200274e8205cecbbc13664c Apr 22 19:58:43.123309 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:43.123276 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pk9ff" Apr 22 19:58:43.181716 
ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:43.181677 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zcc7b" event={"ID":"0ec9b5fe-fc19-49f4-972f-2a642d1424b6","Type":"ContainerStarted","Data":"c4c64277e31ad62af2baf1224c5c1ad6be678ad60200274e8205cecbbc13664c"} Apr 22 19:58:46.189201 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:46.189104 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zcc7b" event={"ID":"0ec9b5fe-fc19-49f4-972f-2a642d1424b6","Type":"ContainerStarted","Data":"bf1e7fbf0f8fe36246e8070b7cf232251ce1f02564ad3b527bef9496addde39e"} Apr 22 19:58:46.204966 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:58:46.204912 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-zcc7b" podStartSLOduration=1.61796995 podStartE2EDuration="5.204897524s" podCreationTimestamp="2026-04-22 19:58:41 +0000 UTC" firstStartedPulling="2026-04-22 19:58:42.302519175 +0000 UTC m=+69.891010482" lastFinishedPulling="2026-04-22 19:58:45.889446748 +0000 UTC m=+73.477938056" observedRunningTime="2026-04-22 19:58:46.204301513 +0000 UTC m=+73.792792845" watchObservedRunningTime="2026-04-22 19:58:46.204897524 +0000 UTC m=+73.793388854" Apr 22 19:59:09.617644 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:09.617604 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx" Apr 22 19:59:09.618022 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:09.617659 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: 
\"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 19:59:09.618022 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:59:09.617746 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:09.618022 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:59:09.617748 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:09.618022 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:59:09.617814 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert podName:26ac7310-bd02-469f-9a0e-31a38e294dc3 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:13.617800252 +0000 UTC m=+161.206291564 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert") pod "ingress-canary-rr5vp" (UID: "26ac7310-bd02-469f-9a0e-31a38e294dc3") : secret "canary-serving-cert" not found Apr 22 19:59:09.618022 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:59:09.617828 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls podName:7fa06416-712c-490d-a430-2c086187fab9 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:13.617821598 +0000 UTC m=+161.206312909 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls") pod "dns-default-8pzdx" (UID: "7fa06416-712c-490d-a430-2c086187fab9") : secret "dns-default-metrics-tls" not found Apr 22 19:59:42.749962 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:42.749902 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 19:59:42.750619 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:59:42.750071 2548 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:59:42.750619 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:59:42.750176 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs podName:450a901e-1810-4879-8bc6-97efb2b1c9d9 nodeName:}" failed. No retries permitted until 2026-04-22 20:01:44.750152273 +0000 UTC m=+252.338643611 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs") pod "network-metrics-daemon-jjztz" (UID: "450a901e-1810-4879-8bc6-97efb2b1c9d9") : secret "metrics-daemon-secret" not found Apr 22 19:59:51.566512 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.566476 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-kq5tv"] Apr 22 19:59:51.569414 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.569399 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.572182 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.572159 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 19:59:51.572412 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.572399 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:59:51.573496 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.573463 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-qj8xz\"" Apr 22 19:59:51.573626 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.573491 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:59:51.573626 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.573535 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 19:59:51.578028 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.578008 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 19:59:51.580701 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.580682 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-kq5tv"] Apr 22 19:59:51.713285 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.713228 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " 
pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.713285 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.713295 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-service-ca-bundle\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.713518 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.713316 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-serving-cert\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.713518 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.713337 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4bf5\" (UniqueName: \"kubernetes.io/projected/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-kube-api-access-n4bf5\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.713518 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.713360 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-tmp\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.713518 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.713377 2548 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-snapshots\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.814095 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.814034 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-snapshots\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.814281 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.814143 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.814281 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.814161 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-service-ca-bundle\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.814281 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.814178 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-serving-cert\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " 
pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.814281 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.814198 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4bf5\" (UniqueName: \"kubernetes.io/projected/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-kube-api-access-n4bf5\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.814281 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.814236 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-tmp\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.814806 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.814775 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-service-ca-bundle\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.814913 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.814810 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-snapshots\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.814913 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.814809 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-tmp\") pod 
\"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.815558 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.815541 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.816849 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.816810 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-serving-cert\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.822784 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.822760 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4bf5\" (UniqueName: \"kubernetes.io/projected/33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd-kube-api-access-n4bf5\") pod \"insights-operator-585dfdc468-kq5tv\" (UID: \"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd\") " pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:51.878800 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:51.878759 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-kq5tv" Apr 22 19:59:52.001134 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:52.001103 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-kq5tv"] Apr 22 19:59:52.004897 ip-10-0-139-10 kubenswrapper[2548]: W0422 19:59:52.004866 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33ee5e09_49ac_4cdb_a6ea_ebfaeb06c4bd.slice/crio-aa034595b55617c2a9856ac04004c64c83a0604f347e38368effe7f52a58008f WatchSource:0}: Error finding container aa034595b55617c2a9856ac04004c64c83a0604f347e38368effe7f52a58008f: Status 404 returned error can't find the container with id aa034595b55617c2a9856ac04004c64c83a0604f347e38368effe7f52a58008f Apr 22 19:59:52.315668 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:52.315630 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kq5tv" event={"ID":"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd","Type":"ContainerStarted","Data":"aa034595b55617c2a9856ac04004c64c83a0604f347e38368effe7f52a58008f"} Apr 22 19:59:54.320627 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:54.320584 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kq5tv" event={"ID":"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd","Type":"ContainerStarted","Data":"a7c5c2cb4ba97aace3fe64e4bb04f2f85857b7720c69179da2684ece15cfa891"} Apr 22 19:59:54.335340 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:54.335289 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-kq5tv" podStartSLOduration=1.7081881330000002 podStartE2EDuration="3.335273283s" podCreationTimestamp="2026-04-22 19:59:51 +0000 UTC" firstStartedPulling="2026-04-22 19:59:52.009301682 +0000 UTC m=+139.597793003" lastFinishedPulling="2026-04-22 
19:59:53.636386845 +0000 UTC m=+141.224878153" observedRunningTime="2026-04-22 19:59:54.334842606 +0000 UTC m=+141.923333932" watchObservedRunningTime="2026-04-22 19:59:54.335273283 +0000 UTC m=+141.923764611" Apr 22 19:59:56.832868 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:56.832836 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tscxv_e37b1adb-be11-4c0a-beea-dbf70b8cda38/dns-node-resolver/0.log" Apr 22 19:59:57.432785 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:57.432755 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vqlnc_dfbb3072-c0b3-48da-8291-55700270a1f3/node-ca/0.log" Apr 22 19:59:59.443353 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.443322 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg"] Apr 22 19:59:59.446475 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.446459 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 19:59:59.448879 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.448846 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:59:59.450077 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.450056 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 19:59:59.450166 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.450099 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 19:59:59.450166 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.450065 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-dhlns\"" Apr 22 19:59:59.455923 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.455903 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg"] Apr 22 19:59:59.572538 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.572503 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 19:59:59.572682 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.572586 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klzzz\" (UniqueName: 
\"kubernetes.io/projected/edf1a1d3-9426-4a01-9072-146ecaba47db-kube-api-access-klzzz\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 19:59:59.673091 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.673050 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 19:59:59.673310 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.673156 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klzzz\" (UniqueName: \"kubernetes.io/projected/edf1a1d3-9426-4a01-9072-146ecaba47db-kube-api-access-klzzz\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 19:59:59.673310 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:59:59.673206 2548 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:59:59.673310 ip-10-0-139-10 kubenswrapper[2548]: E0422 19:59:59.673302 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls podName:edf1a1d3-9426-4a01-9072-146ecaba47db nodeName:}" failed. No retries permitted until 2026-04-22 20:00:00.173284261 +0000 UTC m=+147.761775571 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zckhg" (UID: "edf1a1d3-9426-4a01-9072-146ecaba47db") : secret "samples-operator-tls" not found Apr 22 19:59:59.681944 ip-10-0-139-10 kubenswrapper[2548]: I0422 19:59:59.681920 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klzzz\" (UniqueName: \"kubernetes.io/projected/edf1a1d3-9426-4a01-9072-146ecaba47db-kube-api-access-klzzz\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 20:00:00.178214 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:00.178170 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 20:00:00.178410 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:00.178374 2548 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 20:00:00.178457 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:00.178446 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls podName:edf1a1d3-9426-4a01-9072-146ecaba47db nodeName:}" failed. No retries permitted until 2026-04-22 20:00:01.178430107 +0000 UTC m=+148.766921413 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zckhg" (UID: "edf1a1d3-9426-4a01-9072-146ecaba47db") : secret "samples-operator-tls" not found Apr 22 20:00:01.186047 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.186008 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 20:00:01.186454 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:01.186162 2548 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 20:00:01.186454 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:01.186230 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls podName:edf1a1d3-9426-4a01-9072-146ecaba47db nodeName:}" failed. No retries permitted until 2026-04-22 20:00:03.186212768 +0000 UTC m=+150.774704075 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zckhg" (UID: "edf1a1d3-9426-4a01-9072-146ecaba47db") : secret "samples-operator-tls" not found Apr 22 20:00:01.445641 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.445559 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q"] Apr 22 20:00:01.448481 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.448463 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.451025 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.451000 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 20:00:01.452305 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.452282 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 20:00:01.452440 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.452280 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 20:00:01.452440 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.452338 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 20:00:01.452440 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.452369 2548 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-87ckb\"" Apr 22 20:00:01.458102 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.458072 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q"] Apr 22 20:00:01.545553 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.545514 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc"] Apr 22 20:00:01.548427 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.548410 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.551100 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.551062 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 20:00:01.551100 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.551090 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 20:00:01.551342 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.551122 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 20:00:01.551342 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.551062 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 20:00:01.551342 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.551188 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-sl87h\"" Apr 22 20:00:01.561334 ip-10-0-139-10 kubenswrapper[2548]: I0422 
20:00:01.561303 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc"] Apr 22 20:00:01.589177 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.589131 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn7qd\" (UniqueName: \"kubernetes.io/projected/27321596-248a-4a3c-b6c9-64b406655f9f-kube-api-access-sn7qd\") pod \"kube-storage-version-migrator-operator-6769c5d45-lth8q\" (UID: \"27321596-248a-4a3c-b6c9-64b406655f9f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.589399 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.589209 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27321596-248a-4a3c-b6c9-64b406655f9f-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-lth8q\" (UID: \"27321596-248a-4a3c-b6c9-64b406655f9f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.589399 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.589314 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27321596-248a-4a3c-b6c9-64b406655f9f-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-lth8q\" (UID: \"27321596-248a-4a3c-b6c9-64b406655f9f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.690665 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.690621 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn7qd\" (UniqueName: \"kubernetes.io/projected/27321596-248a-4a3c-b6c9-64b406655f9f-kube-api-access-sn7qd\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-lth8q\" (UID: \"27321596-248a-4a3c-b6c9-64b406655f9f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.690870 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.690682 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27321596-248a-4a3c-b6c9-64b406655f9f-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-lth8q\" (UID: \"27321596-248a-4a3c-b6c9-64b406655f9f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.690870 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.690854 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27321596-248a-4a3c-b6c9-64b406655f9f-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-lth8q\" (UID: \"27321596-248a-4a3c-b6c9-64b406655f9f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.690978 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.690886 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e3f2c01-6cd4-497c-88c9-926d537a876c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-f5jgc\" (UID: \"4e3f2c01-6cd4-497c-88c9-926d537a876c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.690978 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.690906 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3f2c01-6cd4-497c-88c9-926d537a876c-config\") pod \"service-ca-operator-d6fc45fc5-f5jgc\" (UID: 
\"4e3f2c01-6cd4-497c-88c9-926d537a876c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.690978 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.690943 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9z7x\" (UniqueName: \"kubernetes.io/projected/4e3f2c01-6cd4-497c-88c9-926d537a876c-kube-api-access-t9z7x\") pod \"service-ca-operator-d6fc45fc5-f5jgc\" (UID: \"4e3f2c01-6cd4-497c-88c9-926d537a876c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.691301 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.691282 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27321596-248a-4a3c-b6c9-64b406655f9f-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-lth8q\" (UID: \"27321596-248a-4a3c-b6c9-64b406655f9f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.693345 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.693323 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27321596-248a-4a3c-b6c9-64b406655f9f-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-lth8q\" (UID: \"27321596-248a-4a3c-b6c9-64b406655f9f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.703541 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.703471 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn7qd\" (UniqueName: \"kubernetes.io/projected/27321596-248a-4a3c-b6c9-64b406655f9f-kube-api-access-sn7qd\") pod \"kube-storage-version-migrator-operator-6769c5d45-lth8q\" (UID: \"27321596-248a-4a3c-b6c9-64b406655f9f\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.757698 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.757662 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" Apr 22 20:00:01.791915 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.791878 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e3f2c01-6cd4-497c-88c9-926d537a876c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-f5jgc\" (UID: \"4e3f2c01-6cd4-497c-88c9-926d537a876c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.791915 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.791915 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3f2c01-6cd4-497c-88c9-926d537a876c-config\") pod \"service-ca-operator-d6fc45fc5-f5jgc\" (UID: \"4e3f2c01-6cd4-497c-88c9-926d537a876c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.792142 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.791959 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9z7x\" (UniqueName: \"kubernetes.io/projected/4e3f2c01-6cd4-497c-88c9-926d537a876c-kube-api-access-t9z7x\") pod \"service-ca-operator-d6fc45fc5-f5jgc\" (UID: \"4e3f2c01-6cd4-497c-88c9-926d537a876c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.792582 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.792556 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3f2c01-6cd4-497c-88c9-926d537a876c-config\") pod \"service-ca-operator-d6fc45fc5-f5jgc\" (UID: 
\"4e3f2c01-6cd4-497c-88c9-926d537a876c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.794937 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.794915 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e3f2c01-6cd4-497c-88c9-926d537a876c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-f5jgc\" (UID: \"4e3f2c01-6cd4-497c-88c9-926d537a876c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.799895 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.799870 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9z7x\" (UniqueName: \"kubernetes.io/projected/4e3f2c01-6cd4-497c-88c9-926d537a876c-kube-api-access-t9z7x\") pod \"service-ca-operator-d6fc45fc5-f5jgc\" (UID: \"4e3f2c01-6cd4-497c-88c9-926d537a876c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.858741 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.858712 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" Apr 22 20:00:01.874752 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.874637 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q"] Apr 22 20:00:01.877508 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:00:01.877474 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27321596_248a_4a3c_b6c9_64b406655f9f.slice/crio-410e659896c58c90965f69c0ef3e3ef195a2329e0659bd3fcff56cc0244e58f1 WatchSource:0}: Error finding container 410e659896c58c90965f69c0ef3e3ef195a2329e0659bd3fcff56cc0244e58f1: Status 404 returned error can't find the container with id 410e659896c58c90965f69c0ef3e3ef195a2329e0659bd3fcff56cc0244e58f1 Apr 22 20:00:01.974115 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:01.974040 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc"] Apr 22 20:00:01.977327 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:00:01.977299 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e3f2c01_6cd4_497c_88c9_926d537a876c.slice/crio-0a7479e835b40df165e3af7e34cac661fd87bdddfbb84445e74ff2e65ca12578 WatchSource:0}: Error finding container 0a7479e835b40df165e3af7e34cac661fd87bdddfbb84445e74ff2e65ca12578: Status 404 returned error can't find the container with id 0a7479e835b40df165e3af7e34cac661fd87bdddfbb84445e74ff2e65ca12578 Apr 22 20:00:02.336985 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:02.336887 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" 
event={"ID":"27321596-248a-4a3c-b6c9-64b406655f9f","Type":"ContainerStarted","Data":"410e659896c58c90965f69c0ef3e3ef195a2329e0659bd3fcff56cc0244e58f1"} Apr 22 20:00:02.337844 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:02.337822 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" event={"ID":"4e3f2c01-6cd4-497c-88c9-926d537a876c","Type":"ContainerStarted","Data":"0a7479e835b40df165e3af7e34cac661fd87bdddfbb84445e74ff2e65ca12578"} Apr 22 20:00:03.204212 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:03.204170 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 20:00:03.204419 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:03.204381 2548 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 20:00:03.204482 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:03.204463 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls podName:edf1a1d3-9426-4a01-9072-146ecaba47db nodeName:}" failed. No retries permitted until 2026-04-22 20:00:07.204442462 +0000 UTC m=+154.792933774 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zckhg" (UID: "edf1a1d3-9426-4a01-9072-146ecaba47db") : secret "samples-operator-tls" not found Apr 22 20:00:03.903306 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:03.903274 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8c68f5696-nsn5b"] Apr 22 20:00:03.906369 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:03.906351 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:03.908902 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:03.908875 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 20:00:03.909048 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:03.909015 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 20:00:03.909048 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:03.909033 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 20:00:03.909143 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:03.909074 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-glwlc\"" Apr 22 20:00:03.914602 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:03.914568 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 20:00:03.916552 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:03.916486 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8c68f5696-nsn5b"] 
Apr 22 20:00:04.011176 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.011141 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-bound-sa-token\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.011369 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.011205 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-installation-pull-secrets\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.011369 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.011313 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.011460 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.011374 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-ca-trust-extracted\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.011460 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.011402 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-trusted-ca\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.011460 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.011429 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjkk5\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-kube-api-access-mjkk5\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.011586 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.011485 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-certificates\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.011586 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.011554 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-image-registry-private-configuration\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112271 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.112222 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-ca-trust-extracted\") pod \"image-registry-8c68f5696-nsn5b\" (UID: 
\"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112271 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.112270 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-trusted-ca\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112538 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.112291 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjkk5\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-kube-api-access-mjkk5\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112538 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.112329 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-certificates\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112538 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.112405 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-image-registry-private-configuration\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112538 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.112474 2548 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-bound-sa-token\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112735 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.112552 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-installation-pull-secrets\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112735 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.112601 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112916 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.112822 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-ca-trust-extracted\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.112916 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:04.112832 2548 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 20:00:04.112916 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:04.112883 2548 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-8c68f5696-nsn5b: secret "image-registry-tls" not found Apr 22 20:00:04.113103 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:04.112979 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls podName:0af35d92-b1c9-40cb-94d1-c461a9bb4cb8 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:04.612951928 +0000 UTC m=+152.201443240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls") pod "image-registry-8c68f5696-nsn5b" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8") : secret "image-registry-tls" not found Apr 22 20:00:04.114292 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.113827 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-certificates\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.114394 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.114322 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-trusted-ca\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.116730 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.116685 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-installation-pull-secrets\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " 
pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.116730 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.116685 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-image-registry-private-configuration\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.121705 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.121684 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-bound-sa-token\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.121934 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.121913 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjkk5\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-kube-api-access-mjkk5\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.617374 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:04.617331 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:04.617568 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:04.617463 2548 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found 
Apr 22 20:00:04.617568 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:04.617487 2548 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8c68f5696-nsn5b: secret "image-registry-tls" not found Apr 22 20:00:04.617568 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:04.617558 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls podName:0af35d92-b1c9-40cb-94d1-c461a9bb4cb8 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:05.617535846 +0000 UTC m=+153.206027155 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls") pod "image-registry-8c68f5696-nsn5b" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8") : secret "image-registry-tls" not found Apr 22 20:00:05.346731 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:05.346693 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" event={"ID":"27321596-248a-4a3c-b6c9-64b406655f9f","Type":"ContainerStarted","Data":"63842cb37eb26f11720a1df0b017b407d87a0f433483f7276a64d378cfc79e9c"} Apr 22 20:00:05.348123 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:05.348079 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" event={"ID":"4e3f2c01-6cd4-497c-88c9-926d537a876c","Type":"ContainerStarted","Data":"0b0a65b13a8f0a0053aac27b862b087eb817873cf0168960219c2581d15ace3f"} Apr 22 20:00:05.362368 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:05.362314 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" podStartSLOduration=1.786965161 
podStartE2EDuration="4.362297453s" podCreationTimestamp="2026-04-22 20:00:01 +0000 UTC" firstStartedPulling="2026-04-22 20:00:01.879490524 +0000 UTC m=+149.467981832" lastFinishedPulling="2026-04-22 20:00:04.454822817 +0000 UTC m=+152.043314124" observedRunningTime="2026-04-22 20:00:05.36150273 +0000 UTC m=+152.949994070" watchObservedRunningTime="2026-04-22 20:00:05.362297453 +0000 UTC m=+152.950788782" Apr 22 20:00:05.380109 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:05.380047 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" podStartSLOduration=1.90193708 podStartE2EDuration="4.380032265s" podCreationTimestamp="2026-04-22 20:00:01 +0000 UTC" firstStartedPulling="2026-04-22 20:00:01.979209074 +0000 UTC m=+149.567700385" lastFinishedPulling="2026-04-22 20:00:04.457304261 +0000 UTC m=+152.045795570" observedRunningTime="2026-04-22 20:00:05.379300224 +0000 UTC m=+152.967791547" watchObservedRunningTime="2026-04-22 20:00:05.380032265 +0000 UTC m=+152.968523594" Apr 22 20:00:05.625929 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:05.625825 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:05.626087 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:05.625978 2548 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 20:00:05.626087 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:05.625999 2548 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8c68f5696-nsn5b: secret "image-registry-tls" not found Apr 22 20:00:05.626087 
ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:05.626054 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls podName:0af35d92-b1c9-40cb-94d1-c461a9bb4cb8 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:07.626039345 +0000 UTC m=+155.214530652 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls") pod "image-registry-8c68f5696-nsn5b" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8") : secret "image-registry-tls" not found Apr 22 20:00:06.197657 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:06.197622 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw"] Apr 22 20:00:06.200796 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:06.200779 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw" Apr 22 20:00:06.203585 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:06.203564 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-7kknl\"" Apr 22 20:00:06.209561 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:06.209530 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw"] Apr 22 20:00:06.331221 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:06.331177 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6h88\" (UniqueName: \"kubernetes.io/projected/7d2588aa-5729-48a4-bd64-f7fec4d7f0fd-kube-api-access-p6h88\") pod \"network-check-source-8894fc9bd-vrwhw\" (UID: \"7d2588aa-5729-48a4-bd64-f7fec4d7f0fd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw" 
Apr 22 20:00:06.432168 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:06.432129 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6h88\" (UniqueName: \"kubernetes.io/projected/7d2588aa-5729-48a4-bd64-f7fec4d7f0fd-kube-api-access-p6h88\") pod \"network-check-source-8894fc9bd-vrwhw\" (UID: \"7d2588aa-5729-48a4-bd64-f7fec4d7f0fd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw" Apr 22 20:00:06.442736 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:06.442706 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6h88\" (UniqueName: \"kubernetes.io/projected/7d2588aa-5729-48a4-bd64-f7fec4d7f0fd-kube-api-access-p6h88\") pod \"network-check-source-8894fc9bd-vrwhw\" (UID: \"7d2588aa-5729-48a4-bd64-f7fec4d7f0fd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw" Apr 22 20:00:06.510459 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:06.510408 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw" Apr 22 20:00:06.626754 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:06.626721 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw"] Apr 22 20:00:06.629974 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:00:06.629937 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d2588aa_5729_48a4_bd64_f7fec4d7f0fd.slice/crio-fe88d50a2236f27ed537185aded4d887e9521a07ab359228e07896ba5d6c4712 WatchSource:0}: Error finding container fe88d50a2236f27ed537185aded4d887e9521a07ab359228e07896ba5d6c4712: Status 404 returned error can't find the container with id fe88d50a2236f27ed537185aded4d887e9521a07ab359228e07896ba5d6c4712 Apr 22 20:00:07.239269 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:07.239211 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 20:00:07.239490 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:07.239373 2548 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 20:00:07.239490 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:07.239450 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls podName:edf1a1d3-9426-4a01-9072-146ecaba47db nodeName:}" failed. No retries permitted until 2026-04-22 20:00:15.239434189 +0000 UTC m=+162.827925495 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zckhg" (UID: "edf1a1d3-9426-4a01-9072-146ecaba47db") : secret "samples-operator-tls" not found Apr 22 20:00:07.353871 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:07.353829 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw" event={"ID":"7d2588aa-5729-48a4-bd64-f7fec4d7f0fd","Type":"ContainerStarted","Data":"c86bd1c01d17ec544107400e8375447a480d59b9e0718cb52305822b41de4491"} Apr 22 20:00:07.353871 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:07.353877 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw" event={"ID":"7d2588aa-5729-48a4-bd64-f7fec4d7f0fd","Type":"ContainerStarted","Data":"fe88d50a2236f27ed537185aded4d887e9521a07ab359228e07896ba5d6c4712"} Apr 22 20:00:07.376633 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:07.376583 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrwhw" podStartSLOduration=1.376567408 podStartE2EDuration="1.376567408s" podCreationTimestamp="2026-04-22 20:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:07.375398021 +0000 UTC m=+154.963889350" watchObservedRunningTime="2026-04-22 20:00:07.376567408 +0000 UTC m=+154.965058737" Apr 22 20:00:07.643623 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:07.643521 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " 
pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:07.644084 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:07.643726 2548 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 20:00:07.644084 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:07.643754 2548 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8c68f5696-nsn5b: secret "image-registry-tls" not found Apr 22 20:00:07.644084 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:07.643834 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls podName:0af35d92-b1c9-40cb-94d1-c461a9bb4cb8 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:11.643812617 +0000 UTC m=+159.232303926 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls") pod "image-registry-8c68f5696-nsn5b" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8") : secret "image-registry-tls" not found Apr 22 20:00:08.102061 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.102022 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-g2p6k"] Apr 22 20:00:08.105637 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.105611 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.108453 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.108270 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 20:00:08.108453 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.108354 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 20:00:08.108453 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.108406 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5prhs\"" Apr 22 20:00:08.113653 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.113604 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-g2p6k"] Apr 22 20:00:08.249046 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.249003 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c66bba96-eae4-461e-b602-745b3cb8cef1-crio-socket\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.249046 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.249045 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c66bba96-eae4-461e-b602-745b3cb8cef1-data-volume\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.249279 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.249119 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.249279 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.249175 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c66bba96-eae4-461e-b602-745b3cb8cef1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.249279 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.249214 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms227\" (UniqueName: \"kubernetes.io/projected/c66bba96-eae4-461e-b602-745b3cb8cef1-kube-api-access-ms227\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.350570 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.350521 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c66bba96-eae4-461e-b602-745b3cb8cef1-crio-socket\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.350570 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.350574 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c66bba96-eae4-461e-b602-745b3cb8cef1-data-volume\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " 
pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.350751 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.350664 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c66bba96-eae4-461e-b602-745b3cb8cef1-crio-socket\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.350751 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.350707 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.350826 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.350759 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c66bba96-eae4-461e-b602-745b3cb8cef1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.350826 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.350792 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms227\" (UniqueName: \"kubernetes.io/projected/c66bba96-eae4-461e-b602-745b3cb8cef1-kube-api-access-ms227\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.350914 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:08.350834 2548 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret 
"insights-runtime-extractor-tls" not found Apr 22 20:00:08.350914 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.350890 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c66bba96-eae4-461e-b602-745b3cb8cef1-data-volume\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.350995 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:08.350898 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls podName:c66bba96-eae4-461e-b602-745b3cb8cef1 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:08.850880397 +0000 UTC m=+156.439371704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls") pod "insights-runtime-extractor-g2p6k" (UID: "c66bba96-eae4-461e-b602-745b3cb8cef1") : secret "insights-runtime-extractor-tls" not found Apr 22 20:00:08.351213 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.351198 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c66bba96-eae4-461e-b602-745b3cb8cef1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.364669 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.364599 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms227\" (UniqueName: \"kubernetes.io/projected/c66bba96-eae4-461e-b602-745b3cb8cef1-kube-api-access-ms227\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " 
pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.814344 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:08.814295 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8pzdx" podUID="7fa06416-712c-490d-a430-2c086187fab9" Apr 22 20:00:08.820433 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:08.820401 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rr5vp" podUID="26ac7310-bd02-469f-9a0e-31a38e294dc3" Apr 22 20:00:08.854891 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:08.854848 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:08.855018 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:08.855000 2548 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 20:00:08.855076 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:08.855066 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls podName:c66bba96-eae4-461e-b602-745b3cb8cef1 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:09.855048601 +0000 UTC m=+157.443539911 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls") pod "insights-runtime-extractor-g2p6k" (UID: "c66bba96-eae4-461e-b602-745b3cb8cef1") : secret "insights-runtime-extractor-tls" not found Apr 22 20:00:08.964598 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:08.964557 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jjztz" podUID="450a901e-1810-4879-8bc6-97efb2b1c9d9" Apr 22 20:00:09.358382 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:09.358346 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 20:00:09.358382 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:09.358366 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8pzdx" Apr 22 20:00:09.862050 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:09.862015 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:09.862459 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:09.862144 2548 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 20:00:09.862459 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:09.862211 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls podName:c66bba96-eae4-461e-b602-745b3cb8cef1 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:11.862193291 +0000 UTC m=+159.450684603 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls") pod "insights-runtime-extractor-g2p6k" (UID: "c66bba96-eae4-461e-b602-745b3cb8cef1") : secret "insights-runtime-extractor-tls" not found Apr 22 20:00:11.674670 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:11.674629 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:11.675123 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:11.674791 2548 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 20:00:11.675123 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:11.674816 2548 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8c68f5696-nsn5b: secret "image-registry-tls" not found Apr 22 20:00:11.675123 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:11.674893 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls podName:0af35d92-b1c9-40cb-94d1-c461a9bb4cb8 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:19.674867427 +0000 UTC m=+167.263358734 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls") pod "image-registry-8c68f5696-nsn5b" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8") : secret "image-registry-tls" not found Apr 22 20:00:11.876225 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:11.876184 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:11.876409 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:11.876347 2548 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 20:00:11.876483 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:11.876414 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls podName:c66bba96-eae4-461e-b602-745b3cb8cef1 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:15.876398473 +0000 UTC m=+163.464889780 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls") pod "insights-runtime-extractor-g2p6k" (UID: "c66bba96-eae4-461e-b602-745b3cb8cef1") : secret "insights-runtime-extractor-tls" not found Apr 22 20:00:13.693474 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:13.693422 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx" Apr 22 20:00:13.693474 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:13.693481 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 20:00:13.693893 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:13.693568 2548 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 20:00:13.693893 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:13.693573 2548 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 20:00:13.693893 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:13.693620 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert podName:26ac7310-bd02-469f-9a0e-31a38e294dc3 nodeName:}" failed. No retries permitted until 2026-04-22 20:02:15.693605294 +0000 UTC m=+283.282096601 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert") pod "ingress-canary-rr5vp" (UID: "26ac7310-bd02-469f-9a0e-31a38e294dc3") : secret "canary-serving-cert" not found Apr 22 20:00:13.693893 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:13.693648 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls podName:7fa06416-712c-490d-a430-2c086187fab9 nodeName:}" failed. No retries permitted until 2026-04-22 20:02:15.693635232 +0000 UTC m=+283.282126540 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls") pod "dns-default-8pzdx" (UID: "7fa06416-712c-490d-a430-2c086187fab9") : secret "dns-default-metrics-tls" not found Apr 22 20:00:15.307802 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:15.307753 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 20:00:15.310452 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:15.310426 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edf1a1d3-9426-4a01-9072-146ecaba47db-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zckhg\" (UID: \"edf1a1d3-9426-4a01-9072-146ecaba47db\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 20:00:15.355087 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:15.355051 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" Apr 22 20:00:15.476349 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:15.476307 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg"] Apr 22 20:00:15.913841 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:15.913804 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:15.914027 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:15.913970 2548 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 20:00:15.914072 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:15.914043 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls podName:c66bba96-eae4-461e-b602-745b3cb8cef1 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:23.91402427 +0000 UTC m=+171.502515577 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls") pod "insights-runtime-extractor-g2p6k" (UID: "c66bba96-eae4-461e-b602-745b3cb8cef1") : secret "insights-runtime-extractor-tls" not found Apr 22 20:00:16.377346 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:16.377303 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" event={"ID":"edf1a1d3-9426-4a01-9072-146ecaba47db","Type":"ContainerStarted","Data":"da8805c002e6b5ef3e1163e354cdec5034b221dc3fc83beaefc8a6c0107a19cb"} Apr 22 20:00:17.381340 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:17.381229 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" event={"ID":"edf1a1d3-9426-4a01-9072-146ecaba47db","Type":"ContainerStarted","Data":"89ddc1c369ac5249594d363aff13ba6006f59912f81ee99d6045f4bb97dab1b6"} Apr 22 20:00:17.381340 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:17.381287 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" event={"ID":"edf1a1d3-9426-4a01-9072-146ecaba47db","Type":"ContainerStarted","Data":"7be2d4c4ed36ffab1e22f7338da2ebbbcfa7e852b16ee060aed0f76db321ddc1"} Apr 22 20:00:17.397369 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:17.397325 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zckhg" podStartSLOduration=16.904631549 podStartE2EDuration="18.397311456s" podCreationTimestamp="2026-04-22 19:59:59 +0000 UTC" firstStartedPulling="2026-04-22 20:00:15.519178863 +0000 UTC m=+163.107670174" lastFinishedPulling="2026-04-22 20:00:17.011858766 +0000 UTC m=+164.600350081" observedRunningTime="2026-04-22 20:00:17.396444406 
+0000 UTC m=+164.984935734" watchObservedRunningTime="2026-04-22 20:00:17.397311456 +0000 UTC m=+164.985802784" Apr 22 20:00:19.752726 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:19.752678 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:19.755216 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:19.755193 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") pod \"image-registry-8c68f5696-nsn5b\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") " pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:19.816223 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:19.816188 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:19.939679 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:19.939646 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8c68f5696-nsn5b"] Apr 22 20:00:19.942805 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:00:19.942769 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af35d92_b1c9_40cb_94d1_c461a9bb4cb8.slice/crio-4c2210facf71b4f19530b441bb2ed96d937b12f8fa6262030ff351b0681f514b WatchSource:0}: Error finding container 4c2210facf71b4f19530b441bb2ed96d937b12f8fa6262030ff351b0681f514b: Status 404 returned error can't find the container with id 4c2210facf71b4f19530b441bb2ed96d937b12f8fa6262030ff351b0681f514b Apr 22 20:00:20.390851 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:20.390757 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" event={"ID":"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8","Type":"ContainerStarted","Data":"09bcf4fa5065f6eb60e062fc5ad2b107230a52043e8d2588fcd174f42e5aa8b8"} Apr 22 20:00:20.390851 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:20.390803 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" event={"ID":"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8","Type":"ContainerStarted","Data":"4c2210facf71b4f19530b441bb2ed96d937b12f8fa6262030ff351b0681f514b"} Apr 22 20:00:20.391050 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:20.390931 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" Apr 22 20:00:20.414670 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:20.414622 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" 
podStartSLOduration=17.414606331 podStartE2EDuration="17.414606331s" podCreationTimestamp="2026-04-22 20:00:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:20.413561185 +0000 UTC m=+168.002052515" watchObservedRunningTime="2026-04-22 20:00:20.414606331 +0000 UTC m=+168.003097659" Apr 22 20:00:21.947761 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:21.947724 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz" Apr 22 20:00:23.987775 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:23.987724 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:23.990216 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:23.990193 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c66bba96-eae4-461e-b602-745b3cb8cef1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g2p6k\" (UID: \"c66bba96-eae4-461e-b602-745b3cb8cef1\") " pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:24.016264 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:24.016218 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g2p6k" Apr 22 20:00:24.149977 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:24.149944 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-g2p6k"] Apr 22 20:00:24.153019 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:00:24.152983 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc66bba96_eae4_461e_b602_745b3cb8cef1.slice/crio-7b6aac6d1e94556efc4b51b26d1c8049cc245ef1fa125ac91fa3e17af0126d74 WatchSource:0}: Error finding container 7b6aac6d1e94556efc4b51b26d1c8049cc245ef1fa125ac91fa3e17af0126d74: Status 404 returned error can't find the container with id 7b6aac6d1e94556efc4b51b26d1c8049cc245ef1fa125ac91fa3e17af0126d74 Apr 22 20:00:24.402122 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:24.402030 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g2p6k" event={"ID":"c66bba96-eae4-461e-b602-745b3cb8cef1","Type":"ContainerStarted","Data":"b9b25a70a6a04caf007ae6f41b38e8885d5a8e5569e8a0295cebe01dbb4a39b7"} Apr 22 20:00:24.402122 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:24.402072 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g2p6k" event={"ID":"c66bba96-eae4-461e-b602-745b3cb8cef1","Type":"ContainerStarted","Data":"7b6aac6d1e94556efc4b51b26d1c8049cc245ef1fa125ac91fa3e17af0126d74"} Apr 22 20:00:25.406214 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:25.406170 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g2p6k" event={"ID":"c66bba96-eae4-461e-b602-745b3cb8cef1","Type":"ContainerStarted","Data":"a44f7d22da6e489134acbb0af7c174d9a231619a2644751447d905d10cf2db92"} Apr 22 20:00:26.352687 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.352657 2548 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["openshift-image-registry/image-registry-8c68f5696-nsn5b"] Apr 22 20:00:26.379414 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.379389 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7dfdc46845-k74fd"] Apr 22 20:00:26.384358 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.384338 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.392175 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.392154 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7dfdc46845-k74fd"] Apr 22 20:00:26.509879 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.509836 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9mtx\" (UniqueName: \"kubernetes.io/projected/910df736-6a06-4200-9361-0fcde69a47e1-kube-api-access-w9mtx\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.510343 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.509973 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/910df736-6a06-4200-9361-0fcde69a47e1-trusted-ca\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.510343 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.510004 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/910df736-6a06-4200-9361-0fcde69a47e1-bound-sa-token\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " 
pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.510343 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.510042 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/910df736-6a06-4200-9361-0fcde69a47e1-image-registry-private-configuration\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.510343 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.510072 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/910df736-6a06-4200-9361-0fcde69a47e1-installation-pull-secrets\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.510343 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.510143 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/910df736-6a06-4200-9361-0fcde69a47e1-registry-tls\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.510343 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.510171 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/910df736-6a06-4200-9361-0fcde69a47e1-registry-certificates\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.510343 ip-10-0-139-10 kubenswrapper[2548]: I0422 
20:00:26.510194 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/910df736-6a06-4200-9361-0fcde69a47e1-ca-trust-extracted\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.610846 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.610749 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/910df736-6a06-4200-9361-0fcde69a47e1-registry-tls\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.610846 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.610787 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/910df736-6a06-4200-9361-0fcde69a47e1-registry-certificates\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.610846 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.610806 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/910df736-6a06-4200-9361-0fcde69a47e1-ca-trust-extracted\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.610846 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.610832 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9mtx\" (UniqueName: \"kubernetes.io/projected/910df736-6a06-4200-9361-0fcde69a47e1-kube-api-access-w9mtx\") pod 
\"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.611156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.610937 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/910df736-6a06-4200-9361-0fcde69a47e1-trusted-ca\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.611156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.610962 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/910df736-6a06-4200-9361-0fcde69a47e1-bound-sa-token\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.611156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.610990 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/910df736-6a06-4200-9361-0fcde69a47e1-image-registry-private-configuration\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.611156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.611019 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/910df736-6a06-4200-9361-0fcde69a47e1-installation-pull-secrets\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.611390 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.611347 2548 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/910df736-6a06-4200-9361-0fcde69a47e1-ca-trust-extracted\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.611822 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.611765 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/910df736-6a06-4200-9361-0fcde69a47e1-registry-certificates\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.612228 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.612206 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/910df736-6a06-4200-9361-0fcde69a47e1-trusted-ca\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.613647 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.613624 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/910df736-6a06-4200-9361-0fcde69a47e1-image-registry-private-configuration\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.613647 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.613644 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/910df736-6a06-4200-9361-0fcde69a47e1-registry-tls\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") 
" pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.613790 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.613770 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/910df736-6a06-4200-9361-0fcde69a47e1-installation-pull-secrets\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.624957 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.624932 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9mtx\" (UniqueName: \"kubernetes.io/projected/910df736-6a06-4200-9361-0fcde69a47e1-kube-api-access-w9mtx\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.624957 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.624951 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/910df736-6a06-4200-9361-0fcde69a47e1-bound-sa-token\") pod \"image-registry-7dfdc46845-k74fd\" (UID: \"910df736-6a06-4200-9361-0fcde69a47e1\") " pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.702990 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.702954 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:26.824368 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:26.824335 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7dfdc46845-k74fd"] Apr 22 20:00:26.827852 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:00:26.827823 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod910df736_6a06_4200_9361_0fcde69a47e1.slice/crio-653aaa6fe103c3f03611b7a4765407031fc61e56a97ee6c4efe95718047caefb WatchSource:0}: Error finding container 653aaa6fe103c3f03611b7a4765407031fc61e56a97ee6c4efe95718047caefb: Status 404 returned error can't find the container with id 653aaa6fe103c3f03611b7a4765407031fc61e56a97ee6c4efe95718047caefb Apr 22 20:00:27.413217 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:27.413172 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" event={"ID":"910df736-6a06-4200-9361-0fcde69a47e1","Type":"ContainerStarted","Data":"a04f3f509f78e68573f5343a0e143a404cb4d73ab190a78f23279ac18aeb0232"} Apr 22 20:00:27.413217 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:27.413215 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" event={"ID":"910df736-6a06-4200-9361-0fcde69a47e1","Type":"ContainerStarted","Data":"653aaa6fe103c3f03611b7a4765407031fc61e56a97ee6c4efe95718047caefb"} Apr 22 20:00:27.413680 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:27.413234 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" Apr 22 20:00:27.414930 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:27.414908 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g2p6k" 
event={"ID":"c66bba96-eae4-461e-b602-745b3cb8cef1","Type":"ContainerStarted","Data":"58c694791fdd39181c5163b8993eb8bdb690d118cff00a66981dc4db1040a114"} Apr 22 20:00:27.431131 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:27.431083 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7dfdc46845-k74fd" podStartSLOduration=1.431068788 podStartE2EDuration="1.431068788s" podCreationTimestamp="2026-04-22 20:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:27.430465067 +0000 UTC m=+175.018956425" watchObservedRunningTime="2026-04-22 20:00:27.431068788 +0000 UTC m=+175.019560157" Apr 22 20:00:27.446894 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:27.446843 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-g2p6k" podStartSLOduration=17.292613867 podStartE2EDuration="19.446829032s" podCreationTimestamp="2026-04-22 20:00:08 +0000 UTC" firstStartedPulling="2026-04-22 20:00:24.206111675 +0000 UTC m=+171.794602985" lastFinishedPulling="2026-04-22 20:00:26.360326842 +0000 UTC m=+173.948818150" observedRunningTime="2026-04-22 20:00:27.445976775 +0000 UTC m=+175.034468104" watchObservedRunningTime="2026-04-22 20:00:27.446829032 +0000 UTC m=+175.035320360" Apr 22 20:00:33.794689 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.794654 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-l4m6g"] Apr 22 20:00:33.856168 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.856138 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:33.859128 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.859103 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 20:00:33.859285 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.859241 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 20:00:33.859434 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.859407 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vfgt6\"" Apr 22 20:00:33.859536 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.859493 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 20:00:33.859536 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.859507 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 20:00:33.859616 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.859508 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 20:00:33.860232 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.860213 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 20:00:33.974036 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.974003 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-wtmp\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " 
pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:33.974036 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.974040 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7klh\" (UniqueName: \"kubernetes.io/projected/f2f0b7a0-1bba-4840-81dd-1944c681644b-kube-api-access-x7klh\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:33.974227 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.974064 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:33.974227 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.974081 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2f0b7a0-1bba-4840-81dd-1944c681644b-sys\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:33.974227 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.974148 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f2f0b7a0-1bba-4840-81dd-1944c681644b-root\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:33.974227 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.974190 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f2f0b7a0-1bba-4840-81dd-1944c681644b-metrics-client-ca\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:33.974227 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.974211 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-accelerators-collector-config\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:33.974397 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.974241 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-textfile\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:33.974397 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:33.974287 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-tls\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075346 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075240 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f2f0b7a0-1bba-4840-81dd-1944c681644b-root\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075346 ip-10-0-139-10 kubenswrapper[2548]: I0422 
20:00:34.075335 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2f0b7a0-1bba-4840-81dd-1944c681644b-metrics-client-ca\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075563 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075346 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f2f0b7a0-1bba-4840-81dd-1944c681644b-root\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075563 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075373 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-accelerators-collector-config\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075563 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075416 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-textfile\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075563 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075448 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-tls\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 
20:00:34.075563 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:34.075550 2548 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 20:00:34.075803 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:00:34.075610 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-tls podName:f2f0b7a0-1bba-4840-81dd-1944c681644b nodeName:}" failed. No retries permitted until 2026-04-22 20:00:34.575591227 +0000 UTC m=+182.164082541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-tls") pod "node-exporter-l4m6g" (UID: "f2f0b7a0-1bba-4840-81dd-1944c681644b") : secret "node-exporter-tls" not found Apr 22 20:00:34.075803 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075726 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-wtmp\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075803 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075775 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7klh\" (UniqueName: \"kubernetes.io/projected/f2f0b7a0-1bba-4840-81dd-1944c681644b-kube-api-access-x7klh\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075803 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075780 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-textfile\") pod \"node-exporter-l4m6g\" (UID: 
\"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075992 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075820 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075992 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075854 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2f0b7a0-1bba-4840-81dd-1944c681644b-sys\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075992 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075912 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-wtmp\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075992 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075933 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2f0b7a0-1bba-4840-81dd-1944c681644b-sys\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.075992 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.075940 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2f0b7a0-1bba-4840-81dd-1944c681644b-metrics-client-ca\") pod \"node-exporter-l4m6g\" (UID: 
\"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.076190 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.076032 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-accelerators-collector-config\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.078378 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.078360 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.084370 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.084349 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7klh\" (UniqueName: \"kubernetes.io/projected/f2f0b7a0-1bba-4840-81dd-1944c681644b-kube-api-access-x7klh\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.581041 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.581004 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-tls\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.583540 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.583499 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/f2f0b7a0-1bba-4840-81dd-1944c681644b-node-exporter-tls\") pod \"node-exporter-l4m6g\" (UID: \"f2f0b7a0-1bba-4840-81dd-1944c681644b\") " pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.765773 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.765741 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-l4m6g" Apr 22 20:00:34.775429 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:00:34.775390 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2f0b7a0_1bba_4840_81dd_1944c681644b.slice/crio-495a98984833b846fefdfecd9f229b74673b25cf892a3a943d0494cad69767ff WatchSource:0}: Error finding container 495a98984833b846fefdfecd9f229b74673b25cf892a3a943d0494cad69767ff: Status 404 returned error can't find the container with id 495a98984833b846fefdfecd9f229b74673b25cf892a3a943d0494cad69767ff Apr 22 20:00:34.849686 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.849604 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:00:34.910578 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.910546 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:00:34.910743 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.910663 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.913847 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.913827 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 20:00:34.913982 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.913958 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 20:00:34.914229 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.914209 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 20:00:34.914372 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.914220 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 20:00:34.914372 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.914285 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 20:00:34.914372 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.914271 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 20:00:34.914372 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.914352 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-967ml\""
Apr 22 20:00:34.914544 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.914475 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 20:00:34.914594 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.914573 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 20:00:34.914819 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.914802 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 20:00:34.983843 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.983808 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tszfl\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-kube-api-access-tszfl\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.983843 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.983847 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984048 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.983874 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984048 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.983957 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-web-config\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984048 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.984016 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984048 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.984038 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984170 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.984073 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984170 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.984107 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-config-out\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984170 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.984139 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984295 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.984197 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-config-volume\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984295 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.984219 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984295 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.984244 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:34.984295 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:34.984287 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.084955 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.084916 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-config-volume\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.084955 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.084955 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.084983 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085002 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085025 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tszfl\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-kube-api-access-tszfl\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085043 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085075 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085104 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-web-config\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085134 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085163 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085193 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085619 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085234 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-config-out\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.085619 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.085281 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.086376 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.086349 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.087149 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.086857 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.087622 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.087557 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.088434 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.088412 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.088738 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.088712 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.088897 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.088876 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.088968 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.088899 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-config-volume\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.089037 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.089017 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.089243 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.089221 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-web-config\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.089553 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.089534 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.090348 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.090331 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-config-out\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.090647 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.090633 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.095777 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.095757 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tszfl\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-kube-api-access-tszfl\") pod \"alertmanager-main-0\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.223153 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.223072 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:35.359788 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.359749 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 20:00:35.365566 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:00:35.365528 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1551003_1ea5_4cab_a68c_6d06d126f177.slice/crio-3fff5f8bc9bca18e39286ade52e70ad2d3a2c9c2a8c5e9d4427bc3223d523747 WatchSource:0}: Error finding container 3fff5f8bc9bca18e39286ade52e70ad2d3a2c9c2a8c5e9d4427bc3223d523747: Status 404 returned error can't find the container with id 3fff5f8bc9bca18e39286ade52e70ad2d3a2c9c2a8c5e9d4427bc3223d523747
Apr 22 20:00:35.436424 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.436385 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerStarted","Data":"3fff5f8bc9bca18e39286ade52e70ad2d3a2c9c2a8c5e9d4427bc3223d523747"}
Apr 22 20:00:35.437380 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:35.437357 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l4m6g" event={"ID":"f2f0b7a0-1bba-4840-81dd-1944c681644b","Type":"ContainerStarted","Data":"495a98984833b846fefdfecd9f229b74673b25cf892a3a943d0494cad69767ff"}
Apr 22 20:00:36.359421 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:36.359396 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b"
Apr 22 20:00:36.442152 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:36.442114 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l4m6g" event={"ID":"f2f0b7a0-1bba-4840-81dd-1944c681644b","Type":"ContainerStarted","Data":"5270e3d9f0548aead75f30f8d6b258f10414b0fbabd277726e85577bf9b607c4"}
Apr 22 20:00:37.446149 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:37.446114 2548 generic.go:358] "Generic (PLEG): container finished" podID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerID="43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014" exitCode=0
Apr 22 20:00:37.446605 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:37.446195 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerDied","Data":"43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014"}
Apr 22 20:00:37.447551 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:37.447526 2548 generic.go:358] "Generic (PLEG): container finished" podID="f2f0b7a0-1bba-4840-81dd-1944c681644b" containerID="5270e3d9f0548aead75f30f8d6b258f10414b0fbabd277726e85577bf9b607c4" exitCode=0
Apr 22 20:00:37.447666 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:37.447586 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l4m6g" event={"ID":"f2f0b7a0-1bba-4840-81dd-1944c681644b","Type":"ContainerDied","Data":"5270e3d9f0548aead75f30f8d6b258f10414b0fbabd277726e85577bf9b607c4"}
Apr 22 20:00:38.453206 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:38.453165 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l4m6g" event={"ID":"f2f0b7a0-1bba-4840-81dd-1944c681644b","Type":"ContainerStarted","Data":"650e23f53eb53accd666c546f935232ebc2b9b91aa25b26bc1525219867f52e6"}
Apr 22 20:00:38.453206 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:38.453212 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l4m6g" event={"ID":"f2f0b7a0-1bba-4840-81dd-1944c681644b","Type":"ContainerStarted","Data":"3cae9a5a39ea62e281e8b78d803f14e5831439be73784cc62d437208a18c44cf"}
Apr 22 20:00:38.473614 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:38.473550 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-l4m6g" podStartSLOduration=4.01404437 podStartE2EDuration="5.473530806s" podCreationTimestamp="2026-04-22 20:00:33 +0000 UTC" firstStartedPulling="2026-04-22 20:00:34.777075349 +0000 UTC m=+182.365566658" lastFinishedPulling="2026-04-22 20:00:36.236561774 +0000 UTC m=+183.825053094" observedRunningTime="2026-04-22 20:00:38.47112901 +0000 UTC m=+186.059620354" watchObservedRunningTime="2026-04-22 20:00:38.473530806 +0000 UTC m=+186.062022139"
Apr 22 20:00:39.459995 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:39.459953 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerStarted","Data":"6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556"}
Apr 22 20:00:39.459995 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:39.459997 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerStarted","Data":"09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d"}
Apr 22 20:00:39.460572 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:39.460010 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerStarted","Data":"1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5"}
Apr 22 20:00:39.460572 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:39.460022 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerStarted","Data":"934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13"}
Apr 22 20:00:39.460572 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:39.460032 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerStarted","Data":"5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c"}
Apr 22 20:00:40.465073 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:40.465029 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerStarted","Data":"779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488"}
Apr 22 20:00:40.493459 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:40.491412 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.17790395 podStartE2EDuration="6.491393854s" podCreationTimestamp="2026-04-22 20:00:34 +0000 UTC" firstStartedPulling="2026-04-22 20:00:35.367834081 +0000 UTC m=+182.956325402" lastFinishedPulling="2026-04-22 20:00:39.681323997 +0000 UTC m=+187.269815306" observedRunningTime="2026-04-22 20:00:40.489710209 +0000 UTC m=+188.078201574" watchObservedRunningTime="2026-04-22 20:00:40.491393854 +0000 UTC m=+188.079885184"
Apr 22 20:00:48.421643 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:48.421608 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7dfdc46845-k74fd"
Apr 22 20:00:51.372283 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.372205 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" podUID="0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" containerName="registry" containerID="cri-o://09bcf4fa5065f6eb60e062fc5ad2b107230a52043e8d2588fcd174f42e5aa8b8" gracePeriod=30
Apr 22 20:00:51.495690 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.495661 2548 generic.go:358] "Generic (PLEG): container finished" podID="0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" containerID="09bcf4fa5065f6eb60e062fc5ad2b107230a52043e8d2588fcd174f42e5aa8b8" exitCode=0
Apr 22 20:00:51.495819 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.495717 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" event={"ID":"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8","Type":"ContainerDied","Data":"09bcf4fa5065f6eb60e062fc5ad2b107230a52043e8d2588fcd174f42e5aa8b8"}
Apr 22 20:00:51.612295 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.612271 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b"
Apr 22 20:00:51.633818 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.633750 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjkk5\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-kube-api-access-mjkk5\") pod \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") "
Apr 22 20:00:51.633818 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.633792 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-certificates\") pod \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") "
Apr 22 20:00:51.634004 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.633841 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-image-registry-private-configuration\") pod \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") "
Apr 22 20:00:51.634004 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.633866 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-installation-pull-secrets\") pod \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") "
Apr 22 20:00:51.634004 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.633898 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-bound-sa-token\") pod \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") "
Apr 22 20:00:51.634004 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.633924 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") pod \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") "
Apr 22 20:00:51.634004 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.633950 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-trusted-ca\") pod \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") "
Apr 22 20:00:51.634230 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.634017 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-ca-trust-extracted\") pod \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\" (UID: \"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8\") "
Apr 22 20:00:51.634855 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.634323 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:00:51.634855 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.634816 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:00:51.636990 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.636945 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-kube-api-access-mjkk5" (OuterVolumeSpecName: "kube-api-access-mjkk5") pod "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8"). InnerVolumeSpecName "kube-api-access-mjkk5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:00:51.637583 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.637557 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:00:51.637779 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.637756 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:51.637979 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.637946 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:51.638077 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.638022 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:00:51.643641 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.643611 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" (UID: "0af35d92-b1c9-40cb-94d1-c461a9bb4cb8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:00:51.734799 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.734762 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjkk5\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-kube-api-access-mjkk5\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:00:51.734799 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.734791 2548 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-certificates\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:00:51.734799 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.734801 2548 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-image-registry-private-configuration\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:00:51.735029 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.734812 2548 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-installation-pull-secrets\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:00:51.735029 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.734823 2548 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-bound-sa-token\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:00:51.735029 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.734832 2548 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-registry-tls\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:00:51.735029 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.734840 2548 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-trusted-ca\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:00:51.735029 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:51.734848 2548 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8-ca-trust-extracted\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:00:52.499448 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:52.499415 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b" event={"ID":"0af35d92-b1c9-40cb-94d1-c461a9bb4cb8","Type":"ContainerDied","Data":"4c2210facf71b4f19530b441bb2ed96d937b12f8fa6262030ff351b0681f514b"}
Apr 22 20:00:52.499883 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:52.499462 2548 scope.go:117] "RemoveContainer" containerID="09bcf4fa5065f6eb60e062fc5ad2b107230a52043e8d2588fcd174f42e5aa8b8"
Apr 22 20:00:52.499883 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:52.499463 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8c68f5696-nsn5b"
Apr 22 20:00:52.519772 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:52.519743 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8c68f5696-nsn5b"]
Apr 22 20:00:52.522668 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:52.522647 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8c68f5696-nsn5b"]
Apr 22 20:00:52.951136 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:52.951043 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" path="/var/lib/kubelet/pods/0af35d92-b1c9-40cb-94d1-c461a9bb4cb8/volumes"
Apr 22 20:00:53.894922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.894885 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-wgmf6"]
Apr 22 20:00:53.895422 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.895238 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" containerName="registry"
Apr 22 20:00:53.895422 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.895276 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" containerName="registry"
Apr 22 20:00:53.895422 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.895362 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="0af35d92-b1c9-40cb-94d1-c461a9bb4cb8" containerName="registry"
Apr 22 20:00:53.898421 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.898402 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wgmf6"
Apr 22 20:00:53.901018 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.900996 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 20:00:53.901018 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.901013 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-6ttlm\""
Apr 22 20:00:53.901342 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.901307 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 20:00:53.908232 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.908207 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wgmf6"]
Apr 22 20:00:53.949609 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:53.949574 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jklj\" (UniqueName: \"kubernetes.io/projected/cdf0fb12-754a-46f7-a133-7cb4a81e6bdb-kube-api-access-4jklj\") pod \"downloads-6bcc868b7-wgmf6\" (UID: \"cdf0fb12-754a-46f7-a133-7cb4a81e6bdb\") " pod="openshift-console/downloads-6bcc868b7-wgmf6"
Apr 22 20:00:54.050307 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:54.050265 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jklj\" (UniqueName: \"kubernetes.io/projected/cdf0fb12-754a-46f7-a133-7cb4a81e6bdb-kube-api-access-4jklj\") pod \"downloads-6bcc868b7-wgmf6\" (UID: \"cdf0fb12-754a-46f7-a133-7cb4a81e6bdb\") " pod="openshift-console/downloads-6bcc868b7-wgmf6"
Apr 22 20:00:54.057949 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:54.057922 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jklj\" (UniqueName:
\"kubernetes.io/projected/cdf0fb12-754a-46f7-a133-7cb4a81e6bdb-kube-api-access-4jklj\") pod \"downloads-6bcc868b7-wgmf6\" (UID: \"cdf0fb12-754a-46f7-a133-7cb4a81e6bdb\") " pod="openshift-console/downloads-6bcc868b7-wgmf6" Apr 22 20:00:54.207141 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:54.207052 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wgmf6" Apr 22 20:00:54.333894 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:54.333867 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wgmf6"] Apr 22 20:00:54.336321 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:00:54.336290 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdf0fb12_754a_46f7_a133_7cb4a81e6bdb.slice/crio-93bcd97d7099b3c8c45760948720333463ff0da79eea1ed04f13198cc4d9e684 WatchSource:0}: Error finding container 93bcd97d7099b3c8c45760948720333463ff0da79eea1ed04f13198cc4d9e684: Status 404 returned error can't find the container with id 93bcd97d7099b3c8c45760948720333463ff0da79eea1ed04f13198cc4d9e684 Apr 22 20:00:54.506165 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:00:54.506129 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wgmf6" event={"ID":"cdf0fb12-754a-46f7-a133-7cb4a81e6bdb","Type":"ContainerStarted","Data":"93bcd97d7099b3c8c45760948720333463ff0da79eea1ed04f13198cc4d9e684"} Apr 22 20:01:03.197582 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.197543 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d667986fb-sl78h"] Apr 22 20:01:03.200071 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.200046 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.202839 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.202765 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ftd62\"" Apr 22 20:01:03.202839 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.202781 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 20:01:03.203052 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.202853 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 20:01:03.203052 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.202873 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 20:01:03.203052 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.202770 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 20:01:03.203920 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.203899 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 20:01:03.210148 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.209641 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d667986fb-sl78h"] Apr 22 20:01:03.236921 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.236889 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-service-ca\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.237099 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.236949 2548 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-serving-cert\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.237099 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.237019 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-oauth-serving-cert\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.237099 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.237077 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ms2t\" (UniqueName: \"kubernetes.io/projected/d0236b56-bc0f-4668-8908-3d860145e0e2-kube-api-access-7ms2t\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.237278 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.237182 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-oauth-config\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.237278 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.237212 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-console-config\") pod \"console-5d667986fb-sl78h\" (UID: 
\"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.338050 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.337995 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-service-ca\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.338239 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.338065 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-serving-cert\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.338239 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.338130 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-oauth-serving-cert\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.338239 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.338168 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ms2t\" (UniqueName: \"kubernetes.io/projected/d0236b56-bc0f-4668-8908-3d860145e0e2-kube-api-access-7ms2t\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.338239 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.338225 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-oauth-config\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.338434 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.338266 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-console-config\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.338869 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.338839 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-service-ca\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.339144 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.338839 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-oauth-serving-cert\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.339144 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.338939 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-console-config\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.341193 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.341161 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-oauth-config\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.341373 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.341354 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-serving-cert\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.346389 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.346363 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ms2t\" (UniqueName: \"kubernetes.io/projected/d0236b56-bc0f-4668-8908-3d860145e0e2-kube-api-access-7ms2t\") pod \"console-5d667986fb-sl78h\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") " pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:03.513094 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:03.513055 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d667986fb-sl78h" Apr 22 20:01:09.818708 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:09.818682 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d667986fb-sl78h"] Apr 22 20:01:09.821463 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:01:09.821428 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0236b56_bc0f_4668_8908_3d860145e0e2.slice/crio-f4f619cbdae13cfb62219a66dfa7446a71c84a9da09745cd1876ddaca1367155 WatchSource:0}: Error finding container f4f619cbdae13cfb62219a66dfa7446a71c84a9da09745cd1876ddaca1367155: Status 404 returned error can't find the container with id f4f619cbdae13cfb62219a66dfa7446a71c84a9da09745cd1876ddaca1367155 Apr 22 20:01:10.558476 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:10.558418 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wgmf6" event={"ID":"cdf0fb12-754a-46f7-a133-7cb4a81e6bdb","Type":"ContainerStarted","Data":"24c594e7114393a982336773041a720d4bc3b2037d7690fce5fbcfcd1480f887"} Apr 22 20:01:10.559273 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:10.559224 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-wgmf6" Apr 22 20:01:10.561700 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:10.561667 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d667986fb-sl78h" event={"ID":"d0236b56-bc0f-4668-8908-3d860145e0e2","Type":"ContainerStarted","Data":"f4f619cbdae13cfb62219a66dfa7446a71c84a9da09745cd1876ddaca1367155"} Apr 22 20:01:10.571707 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:10.571675 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-wgmf6" Apr 22 20:01:10.576823 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:10.576752 2548 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-wgmf6" podStartSLOduration=1.8346722610000001 podStartE2EDuration="17.57673452s" podCreationTimestamp="2026-04-22 20:00:53 +0000 UTC" firstStartedPulling="2026-04-22 20:00:54.338077444 +0000 UTC m=+201.926568756" lastFinishedPulling="2026-04-22 20:01:10.080139701 +0000 UTC m=+217.668631015" observedRunningTime="2026-04-22 20:01:10.575585321 +0000 UTC m=+218.164076679" watchObservedRunningTime="2026-04-22 20:01:10.57673452 +0000 UTC m=+218.165225845" Apr 22 20:01:12.583498 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.582978 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d9b44885b-vh2jf"] Apr 22 20:01:12.594890 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.594386 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d9b44885b-vh2jf"] Apr 22 20:01:12.594890 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.594537 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.604794 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.604376 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 20:01:12.727334 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.727241 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wsm\" (UniqueName: \"kubernetes.io/projected/e201cf51-7065-4c2f-8ec5-b64448319fea-kube-api-access-s7wsm\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.727508 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.727381 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-serving-cert\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.727508 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.727424 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-service-ca\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.727508 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.727471 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-oauth-serving-cert\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " 
pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.727781 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.727521 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-oauth-config\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.727781 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.727555 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-console-config\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.727781 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.727591 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-trusted-ca-bundle\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.828455 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.828413 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-serving-cert\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.828645 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.828475 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-service-ca\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.828645 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.828528 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-oauth-serving-cert\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.828645 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.828588 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-oauth-config\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.828645 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.828621 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-console-config\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.828878 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.828664 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-trusted-ca-bundle\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.828878 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.828741 2548 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s7wsm\" (UniqueName: \"kubernetes.io/projected/e201cf51-7065-4c2f-8ec5-b64448319fea-kube-api-access-s7wsm\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.830650 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.829957 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-service-ca\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.830650 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.830185 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-oauth-serving-cert\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.830650 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.830567 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-trusted-ca-bundle\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.830650 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.830608 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-console-config\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.833093 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.833055 2548 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-serving-cert\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.833446 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.833424 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-oauth-config\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.838854 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.838794 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wsm\" (UniqueName: \"kubernetes.io/projected/e201cf51-7065-4c2f-8ec5-b64448319fea-kube-api-access-s7wsm\") pod \"console-6d9b44885b-vh2jf\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:12.908641 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:12.908598 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:01:13.098831 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:13.098738 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d9b44885b-vh2jf"] Apr 22 20:01:13.103409 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:01:13.103373 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode201cf51_7065_4c2f_8ec5_b64448319fea.slice/crio-fffd0a62a4e74936fa0764443c742658d4d7e927abb0d67536f4412782018a1f WatchSource:0}: Error finding container fffd0a62a4e74936fa0764443c742658d4d7e927abb0d67536f4412782018a1f: Status 404 returned error can't find the container with id fffd0a62a4e74936fa0764443c742658d4d7e927abb0d67536f4412782018a1f Apr 22 20:01:13.578561 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:13.578520 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d667986fb-sl78h" event={"ID":"d0236b56-bc0f-4668-8908-3d860145e0e2","Type":"ContainerStarted","Data":"3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61"} Apr 22 20:01:13.580382 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:13.580345 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9b44885b-vh2jf" event={"ID":"e201cf51-7065-4c2f-8ec5-b64448319fea","Type":"ContainerStarted","Data":"6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a"} Apr 22 20:01:13.580382 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:13.580385 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9b44885b-vh2jf" event={"ID":"e201cf51-7065-4c2f-8ec5-b64448319fea","Type":"ContainerStarted","Data":"fffd0a62a4e74936fa0764443c742658d4d7e927abb0d67536f4412782018a1f"} Apr 22 20:01:13.594736 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:13.594679 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-5d667986fb-sl78h" podStartSLOduration=7.122367876 podStartE2EDuration="10.594662657s" podCreationTimestamp="2026-04-22 20:01:03 +0000 UTC" firstStartedPulling="2026-04-22 20:01:09.823809645 +0000 UTC m=+217.412300951" lastFinishedPulling="2026-04-22 20:01:13.296104392 +0000 UTC m=+220.884595732" observedRunningTime="2026-04-22 20:01:13.594067037 +0000 UTC m=+221.182558366" watchObservedRunningTime="2026-04-22 20:01:13.594662657 +0000 UTC m=+221.183153987"
Apr 22 20:01:13.609524 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:13.609432 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d9b44885b-vh2jf" podStartSLOduration=1.609412121 podStartE2EDuration="1.609412121s" podCreationTimestamp="2026-04-22 20:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:01:13.608456847 +0000 UTC m=+221.196948201" watchObservedRunningTime="2026-04-22 20:01:13.609412121 +0000 UTC m=+221.197903450"
Apr 22 20:01:14.585159 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:14.585119 2548 generic.go:358] "Generic (PLEG): container finished" podID="33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd" containerID="a7c5c2cb4ba97aace3fe64e4bb04f2f85857b7720c69179da2684ece15cfa891" exitCode=0
Apr 22 20:01:14.585434 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:14.585201 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kq5tv" event={"ID":"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd","Type":"ContainerDied","Data":"a7c5c2cb4ba97aace3fe64e4bb04f2f85857b7720c69179da2684ece15cfa891"}
Apr 22 20:01:14.585888 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:14.585860 2548 scope.go:117] "RemoveContainer" containerID="a7c5c2cb4ba97aace3fe64e4bb04f2f85857b7720c69179da2684ece15cfa891"
Apr 22 20:01:15.589894 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:15.589859 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kq5tv" event={"ID":"33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd","Type":"ContainerStarted","Data":"997565a2adebcadae0fdd722a87dc58beceb9fb37ef96dc6a4476d6376fa3454"}
Apr 22 20:01:16.087641 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:16.087607 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d1551003-1ea5-4cab-a68c-6d06d126f177/init-config-reloader/0.log"
Apr 22 20:01:16.287692 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:16.287657 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d1551003-1ea5-4cab-a68c-6d06d126f177/alertmanager/0.log"
Apr 22 20:01:16.487820 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:16.487788 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d1551003-1ea5-4cab-a68c-6d06d126f177/config-reloader/0.log"
Apr 22 20:01:16.688393 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:16.688363 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d1551003-1ea5-4cab-a68c-6d06d126f177/kube-rbac-proxy-web/0.log"
Apr 22 20:01:16.888051 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:16.887970 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d1551003-1ea5-4cab-a68c-6d06d126f177/kube-rbac-proxy/0.log"
Apr 22 20:01:17.087838 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:17.087806 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d1551003-1ea5-4cab-a68c-6d06d126f177/kube-rbac-proxy-metric/0.log"
Apr 22 20:01:17.287694 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:17.287669 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d1551003-1ea5-4cab-a68c-6d06d126f177/prom-label-proxy/0.log"
Apr 22 20:01:19.287499 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:19.287466 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l4m6g_f2f0b7a0-1bba-4840-81dd-1944c681644b/init-textfile/0.log"
Apr 22 20:01:19.488380 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:19.488345 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l4m6g_f2f0b7a0-1bba-4840-81dd-1944c681644b/node-exporter/0.log"
Apr 22 20:01:19.688203 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:19.688097 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l4m6g_f2f0b7a0-1bba-4840-81dd-1944c681644b/kube-rbac-proxy/0.log"
Apr 22 20:01:20.607963 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:20.607930 2548 generic.go:358] "Generic (PLEG): container finished" podID="4e3f2c01-6cd4-497c-88c9-926d537a876c" containerID="0b0a65b13a8f0a0053aac27b862b087eb817873cf0168960219c2581d15ace3f" exitCode=0
Apr 22 20:01:20.608368 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:20.608008 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" event={"ID":"4e3f2c01-6cd4-497c-88c9-926d537a876c","Type":"ContainerDied","Data":"0b0a65b13a8f0a0053aac27b862b087eb817873cf0168960219c2581d15ace3f"}
Apr 22 20:01:20.608414 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:20.608385 2548 scope.go:117] "RemoveContainer" containerID="0b0a65b13a8f0a0053aac27b862b087eb817873cf0168960219c2581d15ace3f"
Apr 22 20:01:21.612770 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:21.612722 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f5jgc" event={"ID":"4e3f2c01-6cd4-497c-88c9-926d537a876c","Type":"ContainerStarted","Data":"a3e015d34535953dbdde7e9851be93b3a19d16c9f5b89ae30daccbcad2520f9b"}
Apr 22 20:01:22.908835 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:22.908792 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d9b44885b-vh2jf"
Apr 22 20:01:22.908835 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:22.908841 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d9b44885b-vh2jf"
Apr 22 20:01:22.913669 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:22.913646 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d9b44885b-vh2jf"
Apr 22 20:01:23.514137 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:23.514102 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d667986fb-sl78h"
Apr 22 20:01:23.514344 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:23.514152 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d667986fb-sl78h"
Apr 22 20:01:23.518944 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:23.518920 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d667986fb-sl78h"
Apr 22 20:01:23.622982 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:23.622951 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d667986fb-sl78h"
Apr 22 20:01:23.623396 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:23.623377 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d9b44885b-vh2jf"
Apr 22 20:01:23.689143 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:23.689113 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d667986fb-sl78h"]
Apr 22 20:01:24.888133 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:24.888101 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d667986fb-sl78h_d0236b56-bc0f-4668-8908-3d860145e0e2/console/0.log"
Apr 22 20:01:25.087327 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:25.087304 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d9b44885b-vh2jf_e201cf51-7065-4c2f-8ec5-b64448319fea/console/0.log"
Apr 22 20:01:25.289039 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:25.289010 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-wgmf6_cdf0fb12-754a-46f7-a133-7cb4a81e6bdb/download-server/0.log"
Apr 22 20:01:27.087348 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:27.087320 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tscxv_e37b1adb-be11-4c0a-beea-dbf70b8cda38/dns-node-resolver/0.log"
Apr 22 20:01:35.655075 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:35.655040 2548 generic.go:358] "Generic (PLEG): container finished" podID="27321596-248a-4a3c-b6c9-64b406655f9f" containerID="63842cb37eb26f11720a1df0b017b407d87a0f433483f7276a64d378cfc79e9c" exitCode=0
Apr 22 20:01:35.655487 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:35.655107 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" event={"ID":"27321596-248a-4a3c-b6c9-64b406655f9f","Type":"ContainerDied","Data":"63842cb37eb26f11720a1df0b017b407d87a0f433483f7276a64d378cfc79e9c"}
Apr 22 20:01:35.655487 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:35.655419 2548 scope.go:117] "RemoveContainer" containerID="63842cb37eb26f11720a1df0b017b407d87a0f433483f7276a64d378cfc79e9c"
Apr 22 20:01:36.659688 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:36.659654 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lth8q" event={"ID":"27321596-248a-4a3c-b6c9-64b406655f9f","Type":"ContainerStarted","Data":"c43c8d62396b562a8f226704f61bbe66714ba6c986b5badf567de6b56bc01282"}
Apr 22 20:01:44.822634 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:44.822584 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 20:01:44.825069 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:44.825032 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/450a901e-1810-4879-8bc6-97efb2b1c9d9-metrics-certs\") pod \"network-metrics-daemon-jjztz\" (UID: \"450a901e-1810-4879-8bc6-97efb2b1c9d9\") " pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 20:01:45.051088 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:45.051051 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-79t2x\""
Apr 22 20:01:45.059192 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:45.059164 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjztz"
Apr 22 20:01:45.181570 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:45.181541 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jjztz"]
Apr 22 20:01:45.185541 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:01:45.185513 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod450a901e_1810_4879_8bc6_97efb2b1c9d9.slice/crio-5b2bdfa11168743223a93bc65af9f38be34e7fd482e8d5515833bd87657478d6 WatchSource:0}: Error finding container 5b2bdfa11168743223a93bc65af9f38be34e7fd482e8d5515833bd87657478d6: Status 404 returned error can't find the container with id 5b2bdfa11168743223a93bc65af9f38be34e7fd482e8d5515833bd87657478d6
Apr 22 20:01:45.686829 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:45.686735 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jjztz" event={"ID":"450a901e-1810-4879-8bc6-97efb2b1c9d9","Type":"ContainerStarted","Data":"5b2bdfa11168743223a93bc65af9f38be34e7fd482e8d5515833bd87657478d6"}
Apr 22 20:01:47.693986 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:47.693949 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jjztz" event={"ID":"450a901e-1810-4879-8bc6-97efb2b1c9d9","Type":"ContainerStarted","Data":"d28612d1cd7917d95a21cc63d9f74b6ce787d113f48c55d2bae52189aea23963"}
Apr 22 20:01:47.693986 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:47.693987 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jjztz" event={"ID":"450a901e-1810-4879-8bc6-97efb2b1c9d9","Type":"ContainerStarted","Data":"69d13c37dd42ab0020d706b68c52a3d8dc9a006f4a29570c99120228f5fe9ea6"}
Apr 22 20:01:47.712353 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:47.712304 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jjztz" podStartSLOduration=253.237134677 podStartE2EDuration="4m14.712288529s" podCreationTimestamp="2026-04-22 19:57:33 +0000 UTC" firstStartedPulling="2026-04-22 20:01:45.187491877 +0000 UTC m=+252.775983187" lastFinishedPulling="2026-04-22 20:01:46.662645731 +0000 UTC m=+254.251137039" observedRunningTime="2026-04-22 20:01:47.710585782 +0000 UTC m=+255.299077123" watchObservedRunningTime="2026-04-22 20:01:47.712288529 +0000 UTC m=+255.300779858"
Apr 22 20:01:50.643383 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.643313 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d667986fb-sl78h" podUID="d0236b56-bc0f-4668-8908-3d860145e0e2" containerName="console" containerID="cri-o://3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61" gracePeriod=15
Apr 22 20:01:50.930361 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.930337 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d667986fb-sl78h_d0236b56-bc0f-4668-8908-3d860145e0e2/console/0.log"
Apr 22 20:01:50.930476 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.930397 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d667986fb-sl78h"
Apr 22 20:01:50.977078 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.977051 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-console-config\") pod \"d0236b56-bc0f-4668-8908-3d860145e0e2\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") "
Apr 22 20:01:50.977239 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.977102 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-service-ca\") pod \"d0236b56-bc0f-4668-8908-3d860145e0e2\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") "
Apr 22 20:01:50.977239 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.977147 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-oauth-serving-cert\") pod \"d0236b56-bc0f-4668-8908-3d860145e0e2\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") "
Apr 22 20:01:50.977239 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.977182 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-serving-cert\") pod \"d0236b56-bc0f-4668-8908-3d860145e0e2\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") "
Apr 22 20:01:50.977378 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.977267 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ms2t\" (UniqueName: \"kubernetes.io/projected/d0236b56-bc0f-4668-8908-3d860145e0e2-kube-api-access-7ms2t\") pod \"d0236b56-bc0f-4668-8908-3d860145e0e2\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") "
Apr 22 20:01:50.977378 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.977300 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-oauth-config\") pod \"d0236b56-bc0f-4668-8908-3d860145e0e2\" (UID: \"d0236b56-bc0f-4668-8908-3d860145e0e2\") "
Apr 22 20:01:50.977584 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.977553 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-console-config" (OuterVolumeSpecName: "console-config") pod "d0236b56-bc0f-4668-8908-3d860145e0e2" (UID: "d0236b56-bc0f-4668-8908-3d860145e0e2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:01:50.977584 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.977565 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d0236b56-bc0f-4668-8908-3d860145e0e2" (UID: "d0236b56-bc0f-4668-8908-3d860145e0e2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:01:50.977584 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.977582 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-service-ca" (OuterVolumeSpecName: "service-ca") pod "d0236b56-bc0f-4668-8908-3d860145e0e2" (UID: "d0236b56-bc0f-4668-8908-3d860145e0e2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:01:50.979651 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.979625 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0236b56-bc0f-4668-8908-3d860145e0e2-kube-api-access-7ms2t" (OuterVolumeSpecName: "kube-api-access-7ms2t") pod "d0236b56-bc0f-4668-8908-3d860145e0e2" (UID: "d0236b56-bc0f-4668-8908-3d860145e0e2"). InnerVolumeSpecName "kube-api-access-7ms2t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:01:50.979767 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.979744 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d0236b56-bc0f-4668-8908-3d860145e0e2" (UID: "d0236b56-bc0f-4668-8908-3d860145e0e2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:01:50.979835 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:50.979794 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d0236b56-bc0f-4668-8908-3d860145e0e2" (UID: "d0236b56-bc0f-4668-8908-3d860145e0e2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:01:51.078356 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.078318 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7ms2t\" (UniqueName: \"kubernetes.io/projected/d0236b56-bc0f-4668-8908-3d860145e0e2-kube-api-access-7ms2t\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:01:51.078356 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.078349 2548 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-oauth-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:01:51.078356 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.078360 2548 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-console-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:01:51.078356 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.078371 2548 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-service-ca\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:01:51.078626 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.078379 2548 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0236b56-bc0f-4668-8908-3d860145e0e2-oauth-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:01:51.078626 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.078388 2548 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0236b56-bc0f-4668-8908-3d860145e0e2-console-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:01:51.705611 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.705577 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d667986fb-sl78h_d0236b56-bc0f-4668-8908-3d860145e0e2/console/0.log"
Apr 22 20:01:51.706045 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.705623 2548 generic.go:358] "Generic (PLEG): container finished" podID="d0236b56-bc0f-4668-8908-3d860145e0e2" containerID="3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61" exitCode=2
Apr 22 20:01:51.706045 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.705719 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d667986fb-sl78h"
Apr 22 20:01:51.706045 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.705719 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d667986fb-sl78h" event={"ID":"d0236b56-bc0f-4668-8908-3d860145e0e2","Type":"ContainerDied","Data":"3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61"}
Apr 22 20:01:51.706045 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.705760 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d667986fb-sl78h" event={"ID":"d0236b56-bc0f-4668-8908-3d860145e0e2","Type":"ContainerDied","Data":"f4f619cbdae13cfb62219a66dfa7446a71c84a9da09745cd1876ddaca1367155"}
Apr 22 20:01:51.706045 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.705775 2548 scope.go:117] "RemoveContainer" containerID="3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61"
Apr 22 20:01:51.713930 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.713912 2548 scope.go:117] "RemoveContainer" containerID="3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61"
Apr 22 20:01:51.714241 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:01:51.714221 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61\": container with ID starting with 3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61 not found: ID does not exist" containerID="3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61"
Apr 22 20:01:51.714316 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.714270 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61"} err="failed to get container status \"3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61\": rpc error: code = NotFound desc = could not find container \"3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61\": container with ID starting with 3d3128d19aa9c12eb0d0854a12d4a23bc406a5308a439c467003279cbedede61 not found: ID does not exist"
Apr 22 20:01:51.728366 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.728340 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d667986fb-sl78h"]
Apr 22 20:01:51.730925 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:51.730900 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d667986fb-sl78h"]
Apr 22 20:01:52.951228 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:52.951196 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0236b56-bc0f-4668-8908-3d860145e0e2" path="/var/lib/kubelet/pods/d0236b56-bc0f-4668-8908-3d860145e0e2/volumes"
Apr 22 20:01:53.993850 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:53.993814 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 20:01:53.994306 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:53.994277 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="alertmanager" containerID="cri-o://5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c" gracePeriod=120
Apr 22 20:01:53.994457 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:53.994314 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy-metric" containerID="cri-o://6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556" gracePeriod=120
Apr 22 20:01:53.994457 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:53.994346 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy-web" containerID="cri-o://1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5" gracePeriod=120
Apr 22 20:01:53.994457 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:53.994348 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="config-reloader" containerID="cri-o://934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13" gracePeriod=120
Apr 22 20:01:53.994457 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:53.994381 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="prom-label-proxy" containerID="cri-o://779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488" gracePeriod=120
Apr 22 20:01:53.994457 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:53.994426 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy" containerID="cri-o://09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d" gracePeriod=120
Apr 22 20:01:54.718750 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:54.718714 2548 generic.go:358] "Generic (PLEG): container finished" podID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerID="779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488" exitCode=0
Apr 22 20:01:54.718750 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:54.718740 2548 generic.go:358] "Generic (PLEG): container finished" podID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerID="09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d" exitCode=0
Apr 22 20:01:54.718750 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:54.718750 2548 generic.go:358] "Generic (PLEG): container finished" podID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerID="934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13" exitCode=0
Apr 22 20:01:54.718750 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:54.718759 2548 generic.go:358] "Generic (PLEG): container finished" podID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerID="5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c" exitCode=0
Apr 22 20:01:54.718998 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:54.718781 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerDied","Data":"779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488"}
Apr 22 20:01:54.718998 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:54.718811 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerDied","Data":"09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d"}
Apr 22 20:01:54.718998 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:54.718824 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerDied","Data":"934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13"}
Apr 22 20:01:54.718998 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:54.718832 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerDied","Data":"5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c"}
Apr 22 20:01:55.249487 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.249461 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:55.312526 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312431 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-main-tls\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312526 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312475 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312526 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312498 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-main-db\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312526 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312526 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-config-volume\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312553 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-trusted-ca-bundle\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312574 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-tls-assets\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312598 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-web\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312626 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-cluster-tls-config\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312675 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tszfl\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-kube-api-access-tszfl\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312695 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-web-config\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312725 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-config-out\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312746 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.312922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.312770 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-metrics-client-ca\") pod \"d1551003-1ea5-4cab-a68c-6d06d126f177\" (UID: \"d1551003-1ea5-4cab-a68c-6d06d126f177\") "
Apr 22 20:01:55.313523 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.313343 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:01:55.313523 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.313423 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:01:55.314107 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.313995 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:01:55.316341 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.316287 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-config-out" (OuterVolumeSpecName: "config-out") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:01:55.316566 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.316520 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-config-volume" (OuterVolumeSpecName: "config-volume") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:01:55.317027 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.316993 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:01:55.317829 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.317800 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:01:55.318017 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.317996 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-kube-api-access-tszfl" (OuterVolumeSpecName: "kube-api-access-tszfl") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "kube-api-access-tszfl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:01:55.318187 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.318164 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:01:55.318505 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.318473 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:01:55.318635 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.318614 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:01:55.320662 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.320626 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "cluster-tls-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:55.328480 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.328444 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-web-config" (OuterVolumeSpecName: "web-config") pod "d1551003-1ea5-4cab-a68c-6d06d126f177" (UID: "d1551003-1ea5-4cab-a68c-6d06d126f177"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:55.413763 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413724 2548 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-metrics-client-ca\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413763 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413756 2548 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-main-tls\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413763 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413766 2548 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413777 2548 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-main-db\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413787 2548 reconciler_common.go:299] "Volume detached for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-config-volume\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413796 2548 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1551003-1ea5-4cab-a68c-6d06d126f177-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413806 2548 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-tls-assets\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413815 2548 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413823 2548 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-cluster-tls-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413832 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tszfl\" (UniqueName: \"kubernetes.io/projected/d1551003-1ea5-4cab-a68c-6d06d126f177-kube-api-access-tszfl\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413841 2548 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-web-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413848 2548 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1551003-1ea5-4cab-a68c-6d06d126f177-config-out\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.413988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.413857 2548 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d1551003-1ea5-4cab-a68c-6d06d126f177-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:01:55.724749 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.724722 2548 generic.go:358] "Generic (PLEG): container finished" podID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerID="6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556" exitCode=0 Apr 22 20:01:55.724749 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.724750 2548 generic.go:358] "Generic (PLEG): container finished" podID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerID="1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5" exitCode=0 Apr 22 20:01:55.724930 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.724771 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerDied","Data":"6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556"} Apr 22 20:01:55.724930 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.724799 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerDied","Data":"1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5"} Apr 
22 20:01:55.724930 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.724809 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d1551003-1ea5-4cab-a68c-6d06d126f177","Type":"ContainerDied","Data":"3fff5f8bc9bca18e39286ade52e70ad2d3a2c9c2a8c5e9d4427bc3223d523747"} Apr 22 20:01:55.724930 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.724825 2548 scope.go:117] "RemoveContainer" containerID="779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488" Apr 22 20:01:55.724930 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.724871 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.739213 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.739193 2548 scope.go:117] "RemoveContainer" containerID="6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556" Apr 22 20:01:55.745966 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.745944 2548 scope.go:117] "RemoveContainer" containerID="09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d" Apr 22 20:01:55.747174 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.747159 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:55.751276 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.751234 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:55.753722 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.753704 2548 scope.go:117] "RemoveContainer" containerID="1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5" Apr 22 20:01:55.760207 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.760187 2548 scope.go:117] "RemoveContainer" containerID="934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13" Apr 22 20:01:55.766833 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.766814 2548 scope.go:117] 
"RemoveContainer" containerID="5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c" Apr 22 20:01:55.773421 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.773403 2548 scope.go:117] "RemoveContainer" containerID="43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014" Apr 22 20:01:55.778397 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778378 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:55.778661 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778649 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy" Apr 22 20:01:55.778712 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778662 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy" Apr 22 20:01:55.778712 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778674 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy-metric" Apr 22 20:01:55.778712 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778679 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy-metric" Apr 22 20:01:55.778712 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778688 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="config-reloader" Apr 22 20:01:55.778712 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778693 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="config-reloader" Apr 22 20:01:55.778712 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778701 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="init-config-reloader" Apr 22 20:01:55.778712 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778707 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="init-config-reloader" Apr 22 20:01:55.778712 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778712 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0236b56-bc0f-4668-8908-3d860145e0e2" containerName="console" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778718 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0236b56-bc0f-4668-8908-3d860145e0e2" containerName="console" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778727 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="alertmanager" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778732 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="alertmanager" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778737 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy-web" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778742 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy-web" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778747 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="prom-label-proxy" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778751 2548 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="prom-label-proxy" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778795 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy-web" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778803 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy-metric" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778810 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="alertmanager" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778815 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="config-reloader" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778820 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="kube-rbac-proxy" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778827 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" containerName="prom-label-proxy" Apr 22 20:01:55.778975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.778836 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0236b56-bc0f-4668-8908-3d860145e0e2" containerName="console" Apr 22 20:01:55.780532 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.780514 2548 scope.go:117] "RemoveContainer" containerID="779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488" Apr 22 20:01:55.780783 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:01:55.780763 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488\": container with ID starting with 779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488 not found: ID does not exist" containerID="779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488" Apr 22 20:01:55.780847 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.780793 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488"} err="failed to get container status \"779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488\": rpc error: code = NotFound desc = could not find container \"779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488\": container with ID starting with 779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488 not found: ID does not exist" Apr 22 20:01:55.780847 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.780811 2548 scope.go:117] "RemoveContainer" containerID="6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556" Apr 22 20:01:55.781000 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:01:55.780985 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556\": container with ID starting with 6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556 not found: ID does not exist" containerID="6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556" Apr 22 20:01:55.781040 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781005 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556"} err="failed to get container status \"6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556\": rpc error: code = NotFound desc = could not 
find container \"6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556\": container with ID starting with 6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556 not found: ID does not exist" Apr 22 20:01:55.781040 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781016 2548 scope.go:117] "RemoveContainer" containerID="09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d" Apr 22 20:01:55.781236 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:01:55.781222 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d\": container with ID starting with 09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d not found: ID does not exist" containerID="09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d" Apr 22 20:01:55.781309 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781240 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d"} err="failed to get container status \"09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d\": rpc error: code = NotFound desc = could not find container \"09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d\": container with ID starting with 09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d not found: ID does not exist" Apr 22 20:01:55.781309 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781270 2548 scope.go:117] "RemoveContainer" containerID="1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5" Apr 22 20:01:55.781460 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:01:55.781441 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5\": container with ID 
starting with 1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5 not found: ID does not exist" containerID="1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5" Apr 22 20:01:55.781523 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781470 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5"} err="failed to get container status \"1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5\": rpc error: code = NotFound desc = could not find container \"1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5\": container with ID starting with 1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5 not found: ID does not exist" Apr 22 20:01:55.781523 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781493 2548 scope.go:117] "RemoveContainer" containerID="934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13" Apr 22 20:01:55.781710 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:01:55.781695 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13\": container with ID starting with 934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13 not found: ID does not exist" containerID="934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13" Apr 22 20:01:55.781747 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781712 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13"} err="failed to get container status \"934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13\": rpc error: code = NotFound desc = could not find container \"934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13\": container with ID starting with 
934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13 not found: ID does not exist" Apr 22 20:01:55.781747 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781724 2548 scope.go:117] "RemoveContainer" containerID="5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c" Apr 22 20:01:55.781894 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:01:55.781879 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c\": container with ID starting with 5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c not found: ID does not exist" containerID="5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c" Apr 22 20:01:55.781935 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781897 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c"} err="failed to get container status \"5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c\": rpc error: code = NotFound desc = could not find container \"5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c\": container with ID starting with 5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c not found: ID does not exist" Apr 22 20:01:55.781935 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.781908 2548 scope.go:117] "RemoveContainer" containerID="43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014" Apr 22 20:01:55.782075 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:01:55.782059 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014\": container with ID starting with 43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014 not found: ID does not exist" 
containerID="43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014" Apr 22 20:01:55.782111 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.782077 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014"} err="failed to get container status \"43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014\": rpc error: code = NotFound desc = could not find container \"43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014\": container with ID starting with 43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014 not found: ID does not exist" Apr 22 20:01:55.782111 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.782088 2548 scope.go:117] "RemoveContainer" containerID="779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488" Apr 22 20:01:55.782321 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.782291 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488"} err="failed to get container status \"779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488\": rpc error: code = NotFound desc = could not find container \"779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488\": container with ID starting with 779ecaf733cadb86863f0bb575b34afff3059ab9389d3479b068317a95d5d488 not found: ID does not exist" Apr 22 20:01:55.782370 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.782326 2548 scope.go:117] "RemoveContainer" containerID="6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556" Apr 22 20:01:55.782558 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.782542 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556"} err="failed to get container status 
\"6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556\": rpc error: code = NotFound desc = could not find container \"6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556\": container with ID starting with 6128ff47f9a2a012b4fa8014ffbb398bcb950c89f9b710e41148a800ee8f6556 not found: ID does not exist" Apr 22 20:01:55.782558 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.782557 2548 scope.go:117] "RemoveContainer" containerID="09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d" Apr 22 20:01:55.782785 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.782765 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d"} err="failed to get container status \"09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d\": rpc error: code = NotFound desc = could not find container \"09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d\": container with ID starting with 09422d010e17a06939427c024a69e5acaedd27171c68957b963d8c78e44f7b8d not found: ID does not exist" Apr 22 20:01:55.782829 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.782786 2548 scope.go:117] "RemoveContainer" containerID="1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5" Apr 22 20:01:55.783013 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.782992 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5"} err="failed to get container status \"1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5\": rpc error: code = NotFound desc = could not find container \"1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5\": container with ID starting with 1fe9d19f3ab2676353d589114bed4588bfc3f17c3acb8edb77ada2a6e896aca5 not found: ID does not exist" Apr 22 20:01:55.783062 ip-10-0-139-10 
kubenswrapper[2548]: I0422 20:01:55.783016 2548 scope.go:117] "RemoveContainer" containerID="934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13" Apr 22 20:01:55.783224 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.783207 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13"} err="failed to get container status \"934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13\": rpc error: code = NotFound desc = could not find container \"934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13\": container with ID starting with 934697eca938444799f0f418c899ace4449c6d7c98e0681c8435c0f398625e13 not found: ID does not exist" Apr 22 20:01:55.783290 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.783224 2548 scope.go:117] "RemoveContainer" containerID="5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c" Apr 22 20:01:55.783465 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.783444 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c"} err="failed to get container status \"5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c\": rpc error: code = NotFound desc = could not find container \"5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c\": container with ID starting with 5451bdd1a29ff1eb6fc3ef94de73a269c946b597130fb0f3b095716986ac764c not found: ID does not exist" Apr 22 20:01:55.783509 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.783467 2548 scope.go:117] "RemoveContainer" containerID="43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014" Apr 22 20:01:55.783649 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.783635 2548 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014"} err="failed to get container status \"43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014\": rpc error: code = NotFound desc = could not find container \"43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014\": container with ID starting with 43ea9a353b7d6e802f5ee0cba11aad8003d306c8690dc8e35169156da4940014 not found: ID does not exist" Apr 22 20:01:55.808662 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.808633 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:55.808774 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.808690 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.811556 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.811536 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 20:01:55.811688 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.811536 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-967ml\"" Apr 22 20:01:55.811688 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.811607 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 20:01:55.811830 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.811812 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 20:01:55.811908 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.811813 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 20:01:55.812071 ip-10-0-139-10 
kubenswrapper[2548]: I0422 20:01:55.812056 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 20:01:55.812459 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.812440 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 20:01:55.812562 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.812446 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 20:01:55.812562 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.812446 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 20:01:55.816416 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.816400 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 20:01:55.917205 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917169 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9x6n\" (UniqueName: \"kubernetes.io/projected/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-kube-api-access-m9x6n\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917205 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917213 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917436 ip-10-0-139-10 kubenswrapper[2548]: I0422 
20:01:55.917330 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-config-out\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917436 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917364 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917436 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917384 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-web-config\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917436 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917422 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917557 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917445 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917557 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917468 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917557 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917488 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917557 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917505 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917557 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917520 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917557 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917534 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:55.917557 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:55.917552 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018527 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018436 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018527 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018480 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018527 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018515 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018543 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018567 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018590 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018614 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018695 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9x6n\" (UniqueName: 
\"kubernetes.io/projected/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-kube-api-access-m9x6n\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018753 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018828 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-config-out\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018869 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.018928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018907 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-web-config\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.019867 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.018948 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.019867 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.019057 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.020423 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.020396 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.021438 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.021404 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.021650 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.021631 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.021722 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.021642 2548 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-config-out\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.022101 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.022071 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.022188 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.022156 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.022188 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.022177 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.022471 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.022443 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.023370 
ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.023350 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.023646 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.023626 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.023980 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.023966 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-web-config\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.028672 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.028648 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9x6n\" (UniqueName: \"kubernetes.io/projected/1cb0d0fc-e57f-448b-8881-87ee9514f5cf-kube-api-access-m9x6n\") pod \"alertmanager-main-0\" (UID: \"1cb0d0fc-e57f-448b-8881-87ee9514f5cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.117720 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.117686 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:56.245275 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.245212 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:56.248695 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:01:56.248669 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb0d0fc_e57f_448b_8881_87ee9514f5cf.slice/crio-c1cbb7e5637ce30a71258143cb76a41e69c3ed144a9d032bd15206a4abaccf53 WatchSource:0}: Error finding container c1cbb7e5637ce30a71258143cb76a41e69c3ed144a9d032bd15206a4abaccf53: Status 404 returned error can't find the container with id c1cbb7e5637ce30a71258143cb76a41e69c3ed144a9d032bd15206a4abaccf53 Apr 22 20:01:56.728498 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.728466 2548 generic.go:358] "Generic (PLEG): container finished" podID="1cb0d0fc-e57f-448b-8881-87ee9514f5cf" containerID="3e8ea6327e4ae4831f669788078bd9e6ee49aebb755d557152ee2f00d2eaaba9" exitCode=0 Apr 22 20:01:56.728966 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.728554 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1cb0d0fc-e57f-448b-8881-87ee9514f5cf","Type":"ContainerDied","Data":"3e8ea6327e4ae4831f669788078bd9e6ee49aebb755d557152ee2f00d2eaaba9"} Apr 22 20:01:56.728966 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.728596 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1cb0d0fc-e57f-448b-8881-87ee9514f5cf","Type":"ContainerStarted","Data":"c1cbb7e5637ce30a71258143cb76a41e69c3ed144a9d032bd15206a4abaccf53"} Apr 22 20:01:56.955342 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:56.955027 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1551003-1ea5-4cab-a68c-6d06d126f177" 
path="/var/lib/kubelet/pods/d1551003-1ea5-4cab-a68c-6d06d126f177/volumes" Apr 22 20:01:57.736009 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:57.735976 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1cb0d0fc-e57f-448b-8881-87ee9514f5cf","Type":"ContainerStarted","Data":"dc0aa7388b6529c9306326b9297185be1d8d2e38da751cb5e21139c864c4042e"} Apr 22 20:01:57.736009 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:57.736011 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1cb0d0fc-e57f-448b-8881-87ee9514f5cf","Type":"ContainerStarted","Data":"5c0ea4f7a6e3775153356c7e7d5ba2fd2d5c9d20036e67098b7acfbc98ca0a10"} Apr 22 20:01:57.736009 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:57.736020 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1cb0d0fc-e57f-448b-8881-87ee9514f5cf","Type":"ContainerStarted","Data":"83cb5ad5aecbe69d81a395871bd097f548925c4d7afd2415c86cfb5f218bebf9"} Apr 22 20:01:57.736009 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:57.736027 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1cb0d0fc-e57f-448b-8881-87ee9514f5cf","Type":"ContainerStarted","Data":"1ec927f26a1bfa52d074c52b63f243a2be41c177ed53b36719583c2710ad2568"} Apr 22 20:01:57.736624 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:57.736035 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1cb0d0fc-e57f-448b-8881-87ee9514f5cf","Type":"ContainerStarted","Data":"cd0a516e33bd879a8c4d50a3cff4712fe8e4796c12fa674cafe81823da4680d8"} Apr 22 20:01:57.736624 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:57.736043 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"1cb0d0fc-e57f-448b-8881-87ee9514f5cf","Type":"ContainerStarted","Data":"f7ee27b5c4fc298823e7197709e49a8aedbd0727418928c7f5e338203a7b18dd"} Apr 22 20:01:57.762924 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:57.762859 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.762841042 podStartE2EDuration="2.762841042s" podCreationTimestamp="2026-04-22 20:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:01:57.761293731 +0000 UTC m=+265.349785060" watchObservedRunningTime="2026-04-22 20:01:57.762841042 +0000 UTC m=+265.351332378" Apr 22 20:01:58.016579 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.016541 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-56b7f58d88-62kzz"] Apr 22 20:01:58.058902 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.058873 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56b7f58d88-62kzz"] Apr 22 20:01:58.059087 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.059005 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.061973 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.061949 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-6m6l6\"" Apr 22 20:01:58.061973 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.061969 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 20:01:58.062164 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.061979 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 20:01:58.062164 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.061952 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 20:01:58.062321 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.062306 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 20:01:58.062401 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.062387 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 20:01:58.067352 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.067304 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 20:01:58.135792 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.135756 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fff136a-97e6-4ad1-837f-9941016a24d3-serving-certs-ca-bundle\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: 
\"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.135967 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.135808 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-federate-client-tls\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.135967 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.135878 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fff136a-97e6-4ad1-837f-9941016a24d3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.135967 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.135946 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-telemeter-client-tls\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.136083 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.135986 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-secret-telemeter-client\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.136083 ip-10-0-139-10 
kubenswrapper[2548]: I0422 20:01:58.136003 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.136151 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.136083 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6cvn\" (UniqueName: \"kubernetes.io/projected/2fff136a-97e6-4ad1-837f-9941016a24d3-kube-api-access-f6cvn\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.136151 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.136126 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fff136a-97e6-4ad1-837f-9941016a24d3-metrics-client-ca\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.236591 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.236554 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-federate-client-tls\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.236738 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.236596 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fff136a-97e6-4ad1-837f-9941016a24d3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.236738 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.236725 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-telemeter-client-tls\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.236832 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.236790 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-secret-telemeter-client\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.236832 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.236813 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.236899 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.236845 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6cvn\" (UniqueName: \"kubernetes.io/projected/2fff136a-97e6-4ad1-837f-9941016a24d3-kube-api-access-f6cvn\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: 
\"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.236899 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.236879 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fff136a-97e6-4ad1-837f-9941016a24d3-metrics-client-ca\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.236992 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.236954 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fff136a-97e6-4ad1-837f-9941016a24d3-serving-certs-ca-bundle\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.237642 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.237613 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fff136a-97e6-4ad1-837f-9941016a24d3-serving-certs-ca-bundle\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.237870 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.237823 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fff136a-97e6-4ad1-837f-9941016a24d3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.237997 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.237974 2548 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fff136a-97e6-4ad1-837f-9941016a24d3-metrics-client-ca\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.239495 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.239468 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-secret-telemeter-client\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.239579 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.239540 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.239636 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.239622 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-federate-client-tls\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.239673 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.239634 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2fff136a-97e6-4ad1-837f-9941016a24d3-telemeter-client-tls\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: 
\"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.244275 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.244238 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6cvn\" (UniqueName: \"kubernetes.io/projected/2fff136a-97e6-4ad1-837f-9941016a24d3-kube-api-access-f6cvn\") pod \"telemeter-client-56b7f58d88-62kzz\" (UID: \"2fff136a-97e6-4ad1-837f-9941016a24d3\") " pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.369543 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.369453 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" Apr 22 20:01:58.497114 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.497079 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56b7f58d88-62kzz"] Apr 22 20:01:58.500918 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:01:58.500891 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fff136a_97e6_4ad1_837f_9941016a24d3.slice/crio-83f2badb3a97df661ed4b07677f9fcb17a7f920e2a019390ba3674f5ea0d6403 WatchSource:0}: Error finding container 83f2badb3a97df661ed4b07677f9fcb17a7f920e2a019390ba3674f5ea0d6403: Status 404 returned error can't find the container with id 83f2badb3a97df661ed4b07677f9fcb17a7f920e2a019390ba3674f5ea0d6403 Apr 22 20:01:58.739767 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:01:58.739730 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" event={"ID":"2fff136a-97e6-4ad1-837f-9941016a24d3","Type":"ContainerStarted","Data":"83f2badb3a97df661ed4b07677f9fcb17a7f920e2a019390ba3674f5ea0d6403"} Apr 22 20:02:00.747164 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:00.747133 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" event={"ID":"2fff136a-97e6-4ad1-837f-9941016a24d3","Type":"ContainerStarted","Data":"910fbb25adde5879508accffe0c46f913e2d9d36daf96177656a94b8765088f8"} Apr 22 20:02:01.752170 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:01.752129 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" event={"ID":"2fff136a-97e6-4ad1-837f-9941016a24d3","Type":"ContainerStarted","Data":"c64976c07d398123c7c623d2c6abe6435d69c77f4ee0fb9e11023d180b5aeefc"} Apr 22 20:02:01.752170 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:01.752171 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" event={"ID":"2fff136a-97e6-4ad1-837f-9941016a24d3","Type":"ContainerStarted","Data":"415b306af4c7ee1db43c897205d852556cce22be7d7146a4a9ebd519d7a95291"} Apr 22 20:02:01.773657 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:01.773609 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-56b7f58d88-62kzz" podStartSLOduration=1.639530302 podStartE2EDuration="3.773590932s" podCreationTimestamp="2026-04-22 20:01:58 +0000 UTC" firstStartedPulling="2026-04-22 20:01:58.503095895 +0000 UTC m=+266.091587205" lastFinishedPulling="2026-04-22 20:02:00.637156513 +0000 UTC m=+268.225647835" observedRunningTime="2026-04-22 20:02:01.771037683 +0000 UTC m=+269.359529012" watchObservedRunningTime="2026-04-22 20:02:01.773590932 +0000 UTC m=+269.362082282" Apr 22 20:02:02.561337 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.561307 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-747f9454dd-mvkf8"] Apr 22 20:02:02.563631 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.563612 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.576918 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.576890 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-747f9454dd-mvkf8"] Apr 22 20:02:02.676446 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.676409 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s5wt\" (UniqueName: \"kubernetes.io/projected/7bac8959-343b-4946-a09f-7b7ed1e5a7de-kube-api-access-4s5wt\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.676446 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.676456 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-serving-cert\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.676671 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.676508 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-config\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.676671 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.676533 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-trusted-ca-bundle\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 
20:02:02.676671 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.676555 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-service-ca\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.676671 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.676591 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-oauth-config\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.676671 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.676623 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-oauth-serving-cert\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.777149 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.777119 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-oauth-serving-cert\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.777592 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.777182 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s5wt\" (UniqueName: \"kubernetes.io/projected/7bac8959-343b-4946-a09f-7b7ed1e5a7de-kube-api-access-4s5wt\") pod 
\"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.777592 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.777207 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-serving-cert\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.777592 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.777230 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-config\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.777592 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.777296 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-trusted-ca-bundle\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.777592 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.777339 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-service-ca\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.777592 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.777369 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-oauth-config\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.778015 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.777990 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-config\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.778115 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.777990 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-oauth-serving-cert\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.778115 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.778049 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-service-ca\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.778213 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.778175 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-trusted-ca-bundle\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.779905 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.779883 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-serving-cert\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.779981 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.779903 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-oauth-config\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.784791 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.784769 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s5wt\" (UniqueName: \"kubernetes.io/projected/7bac8959-343b-4946-a09f-7b7ed1e5a7de-kube-api-access-4s5wt\") pod \"console-747f9454dd-mvkf8\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:02.873164 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:02.873064 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:03.000170 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:03.000148 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-747f9454dd-mvkf8"] Apr 22 20:02:03.002787 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:02:03.002757 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bac8959_343b_4946_a09f_7b7ed1e5a7de.slice/crio-547c36a0597374f6c737253e30d62a9d1b319ee61ab9a8810c561d36224188e1 WatchSource:0}: Error finding container 547c36a0597374f6c737253e30d62a9d1b319ee61ab9a8810c561d36224188e1: Status 404 returned error can't find the container with id 547c36a0597374f6c737253e30d62a9d1b319ee61ab9a8810c561d36224188e1 Apr 22 20:02:03.759413 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:03.759371 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-747f9454dd-mvkf8" event={"ID":"7bac8959-343b-4946-a09f-7b7ed1e5a7de","Type":"ContainerStarted","Data":"e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec"} Apr 22 20:02:03.759413 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:03.759417 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-747f9454dd-mvkf8" event={"ID":"7bac8959-343b-4946-a09f-7b7ed1e5a7de","Type":"ContainerStarted","Data":"547c36a0597374f6c737253e30d62a9d1b319ee61ab9a8810c561d36224188e1"} Apr 22 20:02:03.778415 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:03.778363 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-747f9454dd-mvkf8" podStartSLOduration=1.7783484600000001 podStartE2EDuration="1.77834846s" podCreationTimestamp="2026-04-22 20:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:02:03.776018148 +0000 UTC 
m=+271.364509513" watchObservedRunningTime="2026-04-22 20:02:03.77834846 +0000 UTC m=+271.366839788" Apr 22 20:02:12.359306 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:02:12.359234 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8pzdx" podUID="7fa06416-712c-490d-a430-2c086187fab9" Apr 22 20:02:12.359306 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:02:12.359287 2548 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rr5vp" podUID="26ac7310-bd02-469f-9a0e-31a38e294dc3" Apr 22 20:02:12.783679 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:12.783653 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 20:02:12.783825 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:12.783653 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8pzdx" Apr 22 20:02:12.873240 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:12.873207 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:12.873404 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:12.873275 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:12.878458 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:12.878434 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:13.790239 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:13.790206 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:02:13.843826 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:13.843793 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d9b44885b-vh2jf"] Apr 22 20:02:15.793146 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:15.793108 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx" Apr 22 20:02:15.793647 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:15.793153 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 20:02:15.795667 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:15.795639 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fa06416-712c-490d-a430-2c086187fab9-metrics-tls\") pod \"dns-default-8pzdx\" (UID: \"7fa06416-712c-490d-a430-2c086187fab9\") " pod="openshift-dns/dns-default-8pzdx" Apr 22 20:02:15.795780 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:15.795734 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ac7310-bd02-469f-9a0e-31a38e294dc3-cert\") pod \"ingress-canary-rr5vp\" (UID: \"26ac7310-bd02-469f-9a0e-31a38e294dc3\") " pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 20:02:16.087398 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:16.087314 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lxbw7\"" Apr 22 20:02:16.088411 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:16.088396 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tzbrr\"" Apr 22 20:02:16.095473 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:16.095448 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8pzdx" Apr 22 20:02:16.095596 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:16.095536 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rr5vp" Apr 22 20:02:16.249633 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:16.249606 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rr5vp"] Apr 22 20:02:16.252043 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:02:16.252002 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26ac7310_bd02_469f_9a0e_31a38e294dc3.slice/crio-3cc98ed640ea60684926984ccdc7ef5253faa5a0f27ae6639f11d6b63a295ef8 WatchSource:0}: Error finding container 3cc98ed640ea60684926984ccdc7ef5253faa5a0f27ae6639f11d6b63a295ef8: Status 404 returned error can't find the container with id 3cc98ed640ea60684926984ccdc7ef5253faa5a0f27ae6639f11d6b63a295ef8 Apr 22 20:02:16.272444 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:16.272312 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8pzdx"] Apr 22 20:02:16.274875 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:02:16.274843 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa06416_712c_490d_a430_2c086187fab9.slice/crio-f9a7515814756fb7575e5922c33751dffeef1f2d915eeecb65d3ec8e7e80dbd2 WatchSource:0}: Error finding container f9a7515814756fb7575e5922c33751dffeef1f2d915eeecb65d3ec8e7e80dbd2: Status 404 returned error can't find the container with id f9a7515814756fb7575e5922c33751dffeef1f2d915eeecb65d3ec8e7e80dbd2 Apr 22 20:02:16.796456 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:16.796417 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rr5vp" event={"ID":"26ac7310-bd02-469f-9a0e-31a38e294dc3","Type":"ContainerStarted","Data":"3cc98ed640ea60684926984ccdc7ef5253faa5a0f27ae6639f11d6b63a295ef8"} Apr 22 20:02:16.797722 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:16.797678 2548 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8pzdx" event={"ID":"7fa06416-712c-490d-a430-2c086187fab9","Type":"ContainerStarted","Data":"f9a7515814756fb7575e5922c33751dffeef1f2d915eeecb65d3ec8e7e80dbd2"} Apr 22 20:02:18.804793 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:18.804752 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rr5vp" event={"ID":"26ac7310-bd02-469f-9a0e-31a38e294dc3","Type":"ContainerStarted","Data":"fb2fd0da40b6695ef6e0f1442ff713816b102bc65bc92ac4f8af419c5df9ee05"} Apr 22 20:02:18.806375 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:18.806344 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8pzdx" event={"ID":"7fa06416-712c-490d-a430-2c086187fab9","Type":"ContainerStarted","Data":"f3822105919e9806e9c9908c1caa199ac7521b89ffe1fb74ebb9b419a51523a5"} Apr 22 20:02:18.806375 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:18.806373 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8pzdx" event={"ID":"7fa06416-712c-490d-a430-2c086187fab9","Type":"ContainerStarted","Data":"585fe10c8a3c19c6c9af59b379f8654c480d50ec848129593df88735b8827566"} Apr 22 20:02:18.806572 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:18.806467 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8pzdx" Apr 22 20:02:18.821561 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:18.821516 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rr5vp" podStartSLOduration=252.019004908 podStartE2EDuration="4m13.821503743s" podCreationTimestamp="2026-04-22 19:58:05 +0000 UTC" firstStartedPulling="2026-04-22 20:02:16.254406907 +0000 UTC m=+283.842898215" lastFinishedPulling="2026-04-22 20:02:18.056905741 +0000 UTC m=+285.645397050" observedRunningTime="2026-04-22 20:02:18.819827924 +0000 UTC m=+286.408319253" 
watchObservedRunningTime="2026-04-22 20:02:18.821503743 +0000 UTC m=+286.409995108" Apr 22 20:02:18.835922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:18.835875 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8pzdx" podStartSLOduration=252.057579055 podStartE2EDuration="4m13.835863356s" podCreationTimestamp="2026-04-22 19:58:05 +0000 UTC" firstStartedPulling="2026-04-22 20:02:16.276764435 +0000 UTC m=+283.865255759" lastFinishedPulling="2026-04-22 20:02:18.055048737 +0000 UTC m=+285.643540060" observedRunningTime="2026-04-22 20:02:18.834654092 +0000 UTC m=+286.423145422" watchObservedRunningTime="2026-04-22 20:02:18.835863356 +0000 UTC m=+286.424354684" Apr 22 20:02:28.813625 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:28.813593 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8pzdx" Apr 22 20:02:32.831749 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:32.831715 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:02:32.833059 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:32.833031 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:02:32.835944 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:32.835923 2548 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 20:02:38.862949 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:38.862888 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d9b44885b-vh2jf" podUID="e201cf51-7065-4c2f-8ec5-b64448319fea" containerName="console" containerID="cri-o://6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a" gracePeriod=15 Apr 22 20:02:39.097432 ip-10-0-139-10 kubenswrapper[2548]: I0422 
20:02:39.097408 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d9b44885b-vh2jf_e201cf51-7065-4c2f-8ec5-b64448319fea/console/0.log" Apr 22 20:02:39.097556 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.097472 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:02:39.180296 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180198 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-oauth-config\") pod \"e201cf51-7065-4c2f-8ec5-b64448319fea\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " Apr 22 20:02:39.180296 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180242 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-trusted-ca-bundle\") pod \"e201cf51-7065-4c2f-8ec5-b64448319fea\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " Apr 22 20:02:39.180296 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180294 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-serving-cert\") pod \"e201cf51-7065-4c2f-8ec5-b64448319fea\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " Apr 22 20:02:39.180572 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180331 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-oauth-serving-cert\") pod \"e201cf51-7065-4c2f-8ec5-b64448319fea\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " Apr 22 20:02:39.180572 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180371 2548 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wsm\" (UniqueName: \"kubernetes.io/projected/e201cf51-7065-4c2f-8ec5-b64448319fea-kube-api-access-s7wsm\") pod \"e201cf51-7065-4c2f-8ec5-b64448319fea\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " Apr 22 20:02:39.180572 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180397 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-service-ca\") pod \"e201cf51-7065-4c2f-8ec5-b64448319fea\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " Apr 22 20:02:39.180572 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180437 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-console-config\") pod \"e201cf51-7065-4c2f-8ec5-b64448319fea\" (UID: \"e201cf51-7065-4c2f-8ec5-b64448319fea\") " Apr 22 20:02:39.180833 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180802 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e201cf51-7065-4c2f-8ec5-b64448319fea" (UID: "e201cf51-7065-4c2f-8ec5-b64448319fea"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:02:39.180898 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180820 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e201cf51-7065-4c2f-8ec5-b64448319fea" (UID: "e201cf51-7065-4c2f-8ec5-b64448319fea"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:02:39.180898 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180854 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-service-ca" (OuterVolumeSpecName: "service-ca") pod "e201cf51-7065-4c2f-8ec5-b64448319fea" (UID: "e201cf51-7065-4c2f-8ec5-b64448319fea"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:02:39.180898 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180859 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-console-config" (OuterVolumeSpecName: "console-config") pod "e201cf51-7065-4c2f-8ec5-b64448319fea" (UID: "e201cf51-7065-4c2f-8ec5-b64448319fea"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:02:39.181007 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180976 2548 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-console-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:02:39.181007 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.180994 2548 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-trusted-ca-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:02:39.181083 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.181006 2548 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-oauth-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:02:39.181083 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.181017 2548 
reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e201cf51-7065-4c2f-8ec5-b64448319fea-service-ca\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:02:39.182729 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.182707 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e201cf51-7065-4c2f-8ec5-b64448319fea" (UID: "e201cf51-7065-4c2f-8ec5-b64448319fea"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:02:39.183030 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.183005 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e201cf51-7065-4c2f-8ec5-b64448319fea" (UID: "e201cf51-7065-4c2f-8ec5-b64448319fea"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:02:39.183030 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.183019 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e201cf51-7065-4c2f-8ec5-b64448319fea-kube-api-access-s7wsm" (OuterVolumeSpecName: "kube-api-access-s7wsm") pod "e201cf51-7065-4c2f-8ec5-b64448319fea" (UID: "e201cf51-7065-4c2f-8ec5-b64448319fea"). InnerVolumeSpecName "kube-api-access-s7wsm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:02:39.281752 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.281700 2548 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-oauth-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:02:39.281752 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.281746 2548 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e201cf51-7065-4c2f-8ec5-b64448319fea-console-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:02:39.281752 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.281757 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s7wsm\" (UniqueName: \"kubernetes.io/projected/e201cf51-7065-4c2f-8ec5-b64448319fea-kube-api-access-s7wsm\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:02:39.873362 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.873335 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d9b44885b-vh2jf_e201cf51-7065-4c2f-8ec5-b64448319fea/console/0.log" Apr 22 20:02:39.873755 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.873378 2548 generic.go:358] "Generic (PLEG): container finished" podID="e201cf51-7065-4c2f-8ec5-b64448319fea" containerID="6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a" exitCode=2 Apr 22 20:02:39.873755 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.873412 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9b44885b-vh2jf" event={"ID":"e201cf51-7065-4c2f-8ec5-b64448319fea","Type":"ContainerDied","Data":"6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a"} Apr 22 20:02:39.873755 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.873451 2548 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-console/console-6d9b44885b-vh2jf" event={"ID":"e201cf51-7065-4c2f-8ec5-b64448319fea","Type":"ContainerDied","Data":"fffd0a62a4e74936fa0764443c742658d4d7e927abb0d67536f4412782018a1f"} Apr 22 20:02:39.873755 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.873455 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d9b44885b-vh2jf" Apr 22 20:02:39.873755 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.873466 2548 scope.go:117] "RemoveContainer" containerID="6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a" Apr 22 20:02:39.882002 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.881986 2548 scope.go:117] "RemoveContainer" containerID="6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a" Apr 22 20:02:39.882230 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:02:39.882215 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a\": container with ID starting with 6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a not found: ID does not exist" containerID="6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a" Apr 22 20:02:39.882307 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.882237 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a"} err="failed to get container status \"6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a\": rpc error: code = NotFound desc = could not find container \"6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a\": container with ID starting with 6947557f0723c6505e12705066a07dc0192fbefb1564ddfe21e1c009d5b6073a not found: ID does not exist" Apr 22 20:02:39.893978 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.893949 2548 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d9b44885b-vh2jf"] Apr 22 20:02:39.896975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:39.896955 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d9b44885b-vh2jf"] Apr 22 20:02:40.955928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:02:40.955890 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e201cf51-7065-4c2f-8ec5-b64448319fea" path="/var/lib/kubelet/pods/e201cf51-7065-4c2f-8ec5-b64448319fea/volumes" Apr 22 20:03:30.480471 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.480436 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59c9c9c9c7-lbd2s"] Apr 22 20:03:30.480902 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.480779 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e201cf51-7065-4c2f-8ec5-b64448319fea" containerName="console" Apr 22 20:03:30.480902 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.480793 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="e201cf51-7065-4c2f-8ec5-b64448319fea" containerName="console" Apr 22 20:03:30.480902 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.480847 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="e201cf51-7065-4c2f-8ec5-b64448319fea" containerName="console" Apr 22 20:03:30.483681 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.483666 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.493207 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.493164 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c9c9c9c7-lbd2s"] Apr 22 20:03:30.495436 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.495412 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-config\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.495565 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.495454 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-service-ca\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.495565 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.495508 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-oauth-config\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.495565 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.495530 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65ncq\" (UniqueName: \"kubernetes.io/projected/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-kube-api-access-65ncq\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 
20:03:30.495565 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.495552 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-oauth-serving-cert\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.495766 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.495621 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-trusted-ca-bundle\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.495766 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.495717 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-serving-cert\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.596970 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.596935 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-service-ca\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.597156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.596980 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-oauth-config\") pod 
\"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.597156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.597003 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65ncq\" (UniqueName: \"kubernetes.io/projected/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-kube-api-access-65ncq\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.597156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.597019 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-oauth-serving-cert\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.597156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.597049 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-trusted-ca-bundle\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.597156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.597087 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-serving-cert\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.597156 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.597126 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-config\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.597835 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.597810 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-service-ca\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.597974 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.597898 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-config\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.597974 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.597874 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-oauth-serving-cert\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.598588 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.598562 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-trusted-ca-bundle\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.599675 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.599646 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-oauth-config\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.599945 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.599924 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-serving-cert\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.605265 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.605229 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65ncq\" (UniqueName: \"kubernetes.io/projected/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-kube-api-access-65ncq\") pod \"console-59c9c9c9c7-lbd2s\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.793523 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.793420 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:30.914467 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.914401 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c9c9c9c7-lbd2s"] Apr 22 20:03:30.917104 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:03:30.917071 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28266ac7_5fb8_4cf5_aaaf_0cfd5246cd48.slice/crio-8faba7f577ab60e7b69b4ae18dc58a26a90d0678c95dcfb5e077769315d3291e WatchSource:0}: Error finding container 8faba7f577ab60e7b69b4ae18dc58a26a90d0678c95dcfb5e077769315d3291e: Status 404 returned error can't find the container with id 8faba7f577ab60e7b69b4ae18dc58a26a90d0678c95dcfb5e077769315d3291e Apr 22 20:03:30.918844 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:30.918829 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:03:31.022385 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:31.022348 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c9c9c9c7-lbd2s" event={"ID":"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48","Type":"ContainerStarted","Data":"003bcfc280bb0f47627dc526ea53f2d335f18623c564259504b53a414cfd6455"} Apr 22 20:03:31.022385 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:31.022385 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c9c9c9c7-lbd2s" event={"ID":"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48","Type":"ContainerStarted","Data":"8faba7f577ab60e7b69b4ae18dc58a26a90d0678c95dcfb5e077769315d3291e"} Apr 22 20:03:31.040895 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:31.040833 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59c9c9c9c7-lbd2s" podStartSLOduration=1.04081271 podStartE2EDuration="1.04081271s" podCreationTimestamp="2026-04-22 20:03:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:03:31.039899069 +0000 UTC m=+358.628390404" watchObservedRunningTime="2026-04-22 20:03:31.04081271 +0000 UTC m=+358.629304040" Apr 22 20:03:40.793613 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:40.793571 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:40.794133 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:40.793626 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:40.803714 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:40.803688 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:41.054580 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:41.054490 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:03:41.096243 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:03:41.096214 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-747f9454dd-mvkf8"] Apr 22 20:04:06.119648 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.119579 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-747f9454dd-mvkf8" podUID="7bac8959-343b-4946-a09f-7b7ed1e5a7de" containerName="console" containerID="cri-o://e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec" gracePeriod=15 Apr 22 20:04:06.358318 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.358295 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-747f9454dd-mvkf8_7bac8959-343b-4946-a09f-7b7ed1e5a7de/console/0.log" Apr 22 20:04:06.358448 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.358358 2548 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-747f9454dd-mvkf8" Apr 22 20:04:06.502308 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502278 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-oauth-serving-cert\") pod \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " Apr 22 20:04:06.502487 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502339 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s5wt\" (UniqueName: \"kubernetes.io/projected/7bac8959-343b-4946-a09f-7b7ed1e5a7de-kube-api-access-4s5wt\") pod \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " Apr 22 20:04:06.502487 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502363 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-service-ca\") pod \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " Apr 22 20:04:06.502487 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502414 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-trusted-ca-bundle\") pod \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " Apr 22 20:04:06.502487 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502435 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-oauth-config\") pod \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " Apr 22 
20:04:06.502487 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502472 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-serving-cert\") pod \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " Apr 22 20:04:06.502743 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502503 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-config\") pod \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\" (UID: \"7bac8959-343b-4946-a09f-7b7ed1e5a7de\") " Apr 22 20:04:06.502796 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502730 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7bac8959-343b-4946-a09f-7b7ed1e5a7de" (UID: "7bac8959-343b-4946-a09f-7b7ed1e5a7de"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:04:06.502927 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502895 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7bac8959-343b-4946-a09f-7b7ed1e5a7de" (UID: "7bac8959-343b-4946-a09f-7b7ed1e5a7de"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:04:06.503001 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.502941 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-config" (OuterVolumeSpecName: "console-config") pod "7bac8959-343b-4946-a09f-7b7ed1e5a7de" (UID: "7bac8959-343b-4946-a09f-7b7ed1e5a7de"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:04:06.503070 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.503052 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-service-ca" (OuterVolumeSpecName: "service-ca") pod "7bac8959-343b-4946-a09f-7b7ed1e5a7de" (UID: "7bac8959-343b-4946-a09f-7b7ed1e5a7de"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:04:06.504658 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.504635 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bac8959-343b-4946-a09f-7b7ed1e5a7de-kube-api-access-4s5wt" (OuterVolumeSpecName: "kube-api-access-4s5wt") pod "7bac8959-343b-4946-a09f-7b7ed1e5a7de" (UID: "7bac8959-343b-4946-a09f-7b7ed1e5a7de"). InnerVolumeSpecName "kube-api-access-4s5wt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:04:06.504780 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.504766 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7bac8959-343b-4946-a09f-7b7ed1e5a7de" (UID: "7bac8959-343b-4946-a09f-7b7ed1e5a7de"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:04:06.504925 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.504905 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7bac8959-343b-4946-a09f-7b7ed1e5a7de" (UID: "7bac8959-343b-4946-a09f-7b7ed1e5a7de"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:04:06.603541 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.603495 2548 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:04:06.603541 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.603536 2548 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:04:06.603541 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.603547 2548 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-oauth-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:04:06.603541 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.603556 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4s5wt\" (UniqueName: \"kubernetes.io/projected/7bac8959-343b-4946-a09f-7b7ed1e5a7de-kube-api-access-4s5wt\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:04:06.603803 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.603567 2548 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-service-ca\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:04:06.603803 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.603577 2548 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bac8959-343b-4946-a09f-7b7ed1e5a7de-trusted-ca-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:04:06.603803 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:06.603585 2548 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bac8959-343b-4946-a09f-7b7ed1e5a7de-console-oauth-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:04:07.126084 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.126054 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-747f9454dd-mvkf8_7bac8959-343b-4946-a09f-7b7ed1e5a7de/console/0.log" Apr 22 20:04:07.126492 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.126093 2548 generic.go:358] "Generic (PLEG): container finished" podID="7bac8959-343b-4946-a09f-7b7ed1e5a7de" containerID="e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec" exitCode=2 Apr 22 20:04:07.126492 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.126126 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-747f9454dd-mvkf8" event={"ID":"7bac8959-343b-4946-a09f-7b7ed1e5a7de","Type":"ContainerDied","Data":"e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec"} Apr 22 20:04:07.126492 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.126167 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-747f9454dd-mvkf8"
Apr 22 20:04:07.126492 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.126177 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-747f9454dd-mvkf8" event={"ID":"7bac8959-343b-4946-a09f-7b7ed1e5a7de","Type":"ContainerDied","Data":"547c36a0597374f6c737253e30d62a9d1b319ee61ab9a8810c561d36224188e1"}
Apr 22 20:04:07.126492 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.126196 2548 scope.go:117] "RemoveContainer" containerID="e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec"
Apr 22 20:04:07.134124 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.134101 2548 scope.go:117] "RemoveContainer" containerID="e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec"
Apr 22 20:04:07.134450 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:07.134431 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec\": container with ID starting with e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec not found: ID does not exist" containerID="e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec"
Apr 22 20:04:07.134515 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.134458 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec"} err="failed to get container status \"e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec\": rpc error: code = NotFound desc = could not find container \"e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec\": container with ID starting with e74ed1c079549ccf92abf1cf00b688067ce4b44efa6df59277506a181f96f9ec not found: ID does not exist"
Apr 22 20:04:07.142821 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.142799 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-747f9454dd-mvkf8"]
Apr 22 20:04:07.148435 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:07.148415 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-747f9454dd-mvkf8"]
Apr 22 20:04:08.951994 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:08.951960 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bac8959-343b-4946-a09f-7b7ed1e5a7de" path="/var/lib/kubelet/pods/7bac8959-343b-4946-a09f-7b7ed1e5a7de/volumes"
Apr 22 20:04:16.107263 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.107214 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"]
Apr 22 20:04:16.107728 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.107521 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bac8959-343b-4946-a09f-7b7ed1e5a7de" containerName="console"
Apr 22 20:04:16.107728 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.107532 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bac8959-343b-4946-a09f-7b7ed1e5a7de" containerName="console"
Apr 22 20:04:16.107728 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.107589 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bac8959-343b-4946-a09f-7b7ed1e5a7de" containerName="console"
Apr 22 20:04:16.110309 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.110292 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"
Apr 22 20:04:16.114268 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.114221 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 20:04:16.114396 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.114222 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-pc89f\""
Apr 22 20:04:16.114396 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.114228 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 20:04:16.114396 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.114294 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 20:04:16.114396 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.114311 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 20:04:16.119228 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.119204 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"]
Apr 22 20:04:16.182968 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.182937 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67gwp\" (UniqueName: \"kubernetes.io/projected/5f034d55-e937-496b-926f-f3a6a2340fae-kube-api-access-67gwp\") pod \"managed-serviceaccount-addon-agent-745777c49c-9mlgj\" (UID: \"5f034d55-e937-496b-926f-f3a6a2340fae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"
Apr 22 20:04:16.183133 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.182979 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5f034d55-e937-496b-926f-f3a6a2340fae-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-745777c49c-9mlgj\" (UID: \"5f034d55-e937-496b-926f-f3a6a2340fae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"
Apr 22 20:04:16.283392 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.283357 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5f034d55-e937-496b-926f-f3a6a2340fae-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-745777c49c-9mlgj\" (UID: \"5f034d55-e937-496b-926f-f3a6a2340fae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"
Apr 22 20:04:16.283571 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.283453 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67gwp\" (UniqueName: \"kubernetes.io/projected/5f034d55-e937-496b-926f-f3a6a2340fae-kube-api-access-67gwp\") pod \"managed-serviceaccount-addon-agent-745777c49c-9mlgj\" (UID: \"5f034d55-e937-496b-926f-f3a6a2340fae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"
Apr 22 20:04:16.286023 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.285996 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5f034d55-e937-496b-926f-f3a6a2340fae-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-745777c49c-9mlgj\" (UID: \"5f034d55-e937-496b-926f-f3a6a2340fae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"
Apr 22 20:04:16.291148 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.291130 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67gwp\" (UniqueName: \"kubernetes.io/projected/5f034d55-e937-496b-926f-f3a6a2340fae-kube-api-access-67gwp\") pod \"managed-serviceaccount-addon-agent-745777c49c-9mlgj\" (UID: \"5f034d55-e937-496b-926f-f3a6a2340fae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"
Apr 22 20:04:16.432268 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.432174 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"
Apr 22 20:04:16.559187 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:16.559155 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj"]
Apr 22 20:04:16.562010 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:04:16.561983 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f034d55_e937_496b_926f_f3a6a2340fae.slice/crio-37a37c6093a7422487d7ff2ec1776ff4503614d3f59104092b2fd5523927b76d WatchSource:0}: Error finding container 37a37c6093a7422487d7ff2ec1776ff4503614d3f59104092b2fd5523927b76d: Status 404 returned error can't find the container with id 37a37c6093a7422487d7ff2ec1776ff4503614d3f59104092b2fd5523927b76d
Apr 22 20:04:17.161660 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:17.161616 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj" event={"ID":"5f034d55-e937-496b-926f-f3a6a2340fae","Type":"ContainerStarted","Data":"37a37c6093a7422487d7ff2ec1776ff4503614d3f59104092b2fd5523927b76d"}
Apr 22 20:04:19.168944 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:19.168852 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj" event={"ID":"5f034d55-e937-496b-926f-f3a6a2340fae","Type":"ContainerStarted","Data":"377f02264e10650af8243af0699eb94e6d6b311d8c52692a674b2e9e09886e8f"}
Apr 22 20:04:19.185201 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:19.185154 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-745777c49c-9mlgj" podStartSLOduration=0.961319196 podStartE2EDuration="3.185141048s" podCreationTimestamp="2026-04-22 20:04:16 +0000 UTC" firstStartedPulling="2026-04-22 20:04:16.563882231 +0000 UTC m=+404.152373537" lastFinishedPulling="2026-04-22 20:04:18.787704082 +0000 UTC m=+406.376195389" observedRunningTime="2026-04-22 20:04:19.183445021 +0000 UTC m=+406.771936350" watchObservedRunningTime="2026-04-22 20:04:19.185141048 +0000 UTC m=+406.773632374"
Apr 22 20:04:39.084892 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.084860 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"]
Apr 22 20:04:39.089382 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.089359 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"
Apr 22 20:04:39.092984 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.092959 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 20:04:39.093477 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.093433 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 20:04:39.093477 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.093449 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 20:04:39.093611 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.093577 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-5x28f\""
Apr 22 20:04:39.105812 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.105788 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"]
Apr 22 20:04:39.150451 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.150420 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td59p\" (UniqueName: \"kubernetes.io/projected/fd822622-7a28-4f0a-a466-244e8e72d202-kube-api-access-td59p\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5\" (UID: \"fd822622-7a28-4f0a-a466-244e8e72d202\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"
Apr 22 20:04:39.150626 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.150494 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/fd822622-7a28-4f0a-a466-244e8e72d202-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5\" (UID: \"fd822622-7a28-4f0a-a466-244e8e72d202\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"
Apr 22 20:04:39.251081 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.251047 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td59p\" (UniqueName: \"kubernetes.io/projected/fd822622-7a28-4f0a-a466-244e8e72d202-kube-api-access-td59p\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5\" (UID: \"fd822622-7a28-4f0a-a466-244e8e72d202\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"
Apr 22 20:04:39.251218 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.251103 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/fd822622-7a28-4f0a-a466-244e8e72d202-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5\" (UID: \"fd822622-7a28-4f0a-a466-244e8e72d202\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"
Apr 22 20:04:39.253617 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.253577 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/fd822622-7a28-4f0a-a466-244e8e72d202-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5\" (UID: \"fd822622-7a28-4f0a-a466-244e8e72d202\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"
Apr 22 20:04:39.259028 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.259001 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td59p\" (UniqueName: \"kubernetes.io/projected/fd822622-7a28-4f0a-a466-244e8e72d202-kube-api-access-td59p\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5\" (UID: \"fd822622-7a28-4f0a-a466-244e8e72d202\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"
Apr 22 20:04:39.399302 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.399182 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"
Apr 22 20:04:39.529628 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:39.529589 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"]
Apr 22 20:04:39.533393 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:04:39.533366 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd822622_7a28_4f0a_a466_244e8e72d202.slice/crio-69b9bdae8ad12169baf2e0e2435b48ea8f58dd9b52f45897345049f23c156937 WatchSource:0}: Error finding container 69b9bdae8ad12169baf2e0e2435b48ea8f58dd9b52f45897345049f23c156937: Status 404 returned error can't find the container with id 69b9bdae8ad12169baf2e0e2435b48ea8f58dd9b52f45897345049f23c156937
Apr 22 20:04:40.228574 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:40.228534 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5" event={"ID":"fd822622-7a28-4f0a-a466-244e8e72d202","Type":"ContainerStarted","Data":"69b9bdae8ad12169baf2e0e2435b48ea8f58dd9b52f45897345049f23c156937"}
Apr 22 20:04:43.242707 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.242671 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5" event={"ID":"fd822622-7a28-4f0a-a466-244e8e72d202","Type":"ContainerStarted","Data":"5336685e6392f5c5558143541b1973ef02e863a3bc2416ca1a0ef0c0dfa425ed"}
Apr 22 20:04:43.243092 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.242839 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5"
Apr 22 20:04:43.270928 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.270872 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5" podStartSLOduration=0.721106397 podStartE2EDuration="4.270853021s" podCreationTimestamp="2026-04-22 20:04:39 +0000 UTC" firstStartedPulling="2026-04-22 20:04:39.535243059 +0000 UTC m=+427.123734365" lastFinishedPulling="2026-04-22 20:04:43.084989674 +0000 UTC m=+430.673480989" observedRunningTime="2026-04-22 20:04:43.268986002 +0000 UTC m=+430.857477342" watchObservedRunningTime="2026-04-22 20:04:43.270853021 +0000 UTC m=+430.859344396"
Apr 22 20:04:43.613058 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.613011 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-f5jvz"]
Apr 22 20:04:43.616348 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.616330 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:43.618814 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.618786 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 20:04:43.618814 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.618811 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 20:04:43.619002 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.618786 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-2jhrf\""
Apr 22 20:04:43.625854 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.625825 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-f5jvz"]
Apr 22 20:04:43.691343 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.691308 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/fc7217f1-10c5-45f1-996f-ee2a270254bd-cabundle0\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:43.691343 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.691354 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgcm\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-kube-api-access-gkgcm\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:43.691549 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.691374 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:43.792668 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.792633 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgcm\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-kube-api-access-gkgcm\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:43.792668 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.792669 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:43.792868 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.792730 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/fc7217f1-10c5-45f1-996f-ee2a270254bd-cabundle0\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:43.792905 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:43.792869 2548 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 22 20:04:43.792905 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:43.792893 2548 secret.go:281] references non-existent secret key: ca.crt
Apr 22 20:04:43.792905 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:43.792900 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 20:04:43.793048 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:43.792913 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f5jvz: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 20:04:43.793048 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:43.792979 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates podName:fc7217f1-10c5-45f1-996f-ee2a270254bd nodeName:}" failed. No retries permitted until 2026-04-22 20:04:44.292962282 +0000 UTC m=+431.881453589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates") pod "keda-operator-ffbb595cb-f5jvz" (UID: "fc7217f1-10c5-45f1-996f-ee2a270254bd") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 20:04:43.793302 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.793281 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/fc7217f1-10c5-45f1-996f-ee2a270254bd-cabundle0\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:43.803191 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.803166 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgcm\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-kube-api-access-gkgcm\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:43.920880 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.920799 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"]
Apr 22 20:04:43.924129 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.924112 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:43.926807 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.926785 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 22 20:04:43.931653 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.931632 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"]
Apr 22 20:04:43.994799 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.994768 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ebf9d827-13eb-44a1-a841-0e24ae998e78-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:43.994956 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.994876 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:43.994956 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:43.994901 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msdt5\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-kube-api-access-msdt5\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:44.095739 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.095703 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msdt5\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-kube-api-access-msdt5\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:44.095927 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.095773 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ebf9d827-13eb-44a1-a841-0e24ae998e78-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:44.095927 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.095874 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:44.096053 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.095974 2548 secret.go:281] references non-existent secret key: tls.crt
Apr 22 20:04:44.096053 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.095990 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 20:04:44.096053 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.096007 2548 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 22 20:04:44.096053 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.096030 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 22 20:04:44.096231 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.096112 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates podName:ebf9d827-13eb-44a1-a841-0e24ae998e78 nodeName:}" failed. No retries permitted until 2026-04-22 20:04:44.596092315 +0000 UTC m=+432.184583625 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates") pod "keda-metrics-apiserver-7c9f485588-7s7lp" (UID: "ebf9d827-13eb-44a1-a841-0e24ae998e78") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 22 20:04:44.096231 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.096198 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ebf9d827-13eb-44a1-a841-0e24ae998e78-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:44.107507 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.107474 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msdt5\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-kube-api-access-msdt5\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:44.220421 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.220345 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-5967l"]
Apr 22 20:04:44.223622 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.223607 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-5967l"
Apr 22 20:04:44.226414 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.226394 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 20:04:44.232450 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.232427 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-5967l"]
Apr 22 20:04:44.298424 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.298393 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:44.298758 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.298439 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbqp\" (UniqueName: \"kubernetes.io/projected/b9dc7534-2ad7-4297-aad1-38a4ec791392-kube-api-access-ffbqp\") pod \"keda-admission-cf49989db-5967l\" (UID: \"b9dc7534-2ad7-4297-aad1-38a4ec791392\") " pod="openshift-keda/keda-admission-cf49989db-5967l"
Apr 22 20:04:44.298758 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.298507 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9dc7534-2ad7-4297-aad1-38a4ec791392-certificates\") pod \"keda-admission-cf49989db-5967l\" (UID: \"b9dc7534-2ad7-4297-aad1-38a4ec791392\") " pod="openshift-keda/keda-admission-cf49989db-5967l"
Apr 22 20:04:44.298758 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.298527 2548 secret.go:281] references non-existent secret key: ca.crt
Apr 22 20:04:44.298758 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.298544 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 20:04:44.298758 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.298552 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f5jvz: references non-existent secret key: ca.crt
Apr 22 20:04:44.298758 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.298600 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates podName:fc7217f1-10c5-45f1-996f-ee2a270254bd nodeName:}" failed. No retries permitted until 2026-04-22 20:04:45.298585767 +0000 UTC m=+432.887077074 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates") pod "keda-operator-ffbb595cb-f5jvz" (UID: "fc7217f1-10c5-45f1-996f-ee2a270254bd") : references non-existent secret key: ca.crt
Apr 22 20:04:44.399453 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.399422 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbqp\" (UniqueName: \"kubernetes.io/projected/b9dc7534-2ad7-4297-aad1-38a4ec791392-kube-api-access-ffbqp\") pod \"keda-admission-cf49989db-5967l\" (UID: \"b9dc7534-2ad7-4297-aad1-38a4ec791392\") " pod="openshift-keda/keda-admission-cf49989db-5967l"
Apr 22 20:04:44.399603 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.399506 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9dc7534-2ad7-4297-aad1-38a4ec791392-certificates\") pod \"keda-admission-cf49989db-5967l\" (UID: \"b9dc7534-2ad7-4297-aad1-38a4ec791392\") " pod="openshift-keda/keda-admission-cf49989db-5967l"
Apr 22 20:04:44.402043 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.402025 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9dc7534-2ad7-4297-aad1-38a4ec791392-certificates\") pod \"keda-admission-cf49989db-5967l\" (UID: \"b9dc7534-2ad7-4297-aad1-38a4ec791392\") " pod="openshift-keda/keda-admission-cf49989db-5967l"
Apr 22 20:04:44.409587 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.409561 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbqp\" (UniqueName: \"kubernetes.io/projected/b9dc7534-2ad7-4297-aad1-38a4ec791392-kube-api-access-ffbqp\") pod \"keda-admission-cf49989db-5967l\" (UID: \"b9dc7534-2ad7-4297-aad1-38a4ec791392\") " pod="openshift-keda/keda-admission-cf49989db-5967l"
Apr 22 20:04:44.534869 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.534827 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-5967l"
Apr 22 20:04:44.601193 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.601162 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"
Apr 22 20:04:44.601408 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.601387 2548 secret.go:281] references non-existent secret key: tls.crt
Apr 22 20:04:44.601490 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.601412 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 20:04:44.601490 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.601435 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp: references non-existent secret key: tls.crt
Apr 22 20:04:44.601580 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:44.601505 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates podName:ebf9d827-13eb-44a1-a841-0e24ae998e78 nodeName:}" failed. No retries permitted until 2026-04-22 20:04:45.601482754 +0000 UTC m=+433.189974067 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates") pod "keda-metrics-apiserver-7c9f485588-7s7lp" (UID: "ebf9d827-13eb-44a1-a841-0e24ae998e78") : references non-existent secret key: tls.crt
Apr 22 20:04:44.660532 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:44.660496 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-5967l"]
Apr 22 20:04:44.664991 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:04:44.664963 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9dc7534_2ad7_4297_aad1_38a4ec791392.slice/crio-b31c16afaff7b6ce41cf6f489c704f28b6b91c0f26a92e65e5b1b23ff8a6e5b1 WatchSource:0}: Error finding container b31c16afaff7b6ce41cf6f489c704f28b6b91c0f26a92e65e5b1b23ff8a6e5b1: Status 404 returned error can't find the container with id b31c16afaff7b6ce41cf6f489c704f28b6b91c0f26a92e65e5b1b23ff8a6e5b1
Apr 22 20:04:45.250855 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:45.250817 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-5967l" event={"ID":"b9dc7534-2ad7-4297-aad1-38a4ec791392","Type":"ContainerStarted","Data":"b31c16afaff7b6ce41cf6f489c704f28b6b91c0f26a92e65e5b1b23ff8a6e5b1"}
Apr 22 20:04:45.306040 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:45.305998 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz"
Apr 22 20:04:45.306429 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:45.306157 2548 secret.go:281] references non-existent secret key: ca.crt
Apr 22 20:04:45.306429 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:45.306175 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 20:04:45.306429 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:45.306183 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f5jvz: references non-existent secret key: ca.crt
Apr 22 20:04:45.306429 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:45.306237 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates podName:fc7217f1-10c5-45f1-996f-ee2a270254bd nodeName:}" failed. No retries permitted until 2026-04-22 20:04:47.306220824 +0000 UTC m=+434.894712131 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates") pod "keda-operator-ffbb595cb-f5jvz" (UID: "fc7217f1-10c5-45f1-996f-ee2a270254bd") : references non-existent secret key: ca.crt Apr 22 20:04:45.608645 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:45.608521 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" Apr 22 20:04:45.608816 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:45.608655 2548 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:04:45.608816 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:45.608669 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:04:45.608816 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:45.608688 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp: references non-existent secret key: tls.crt Apr 22 20:04:45.608816 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:45.608741 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates podName:ebf9d827-13eb-44a1-a841-0e24ae998e78 nodeName:}" failed. No retries permitted until 2026-04-22 20:04:47.608726913 +0000 UTC m=+435.197218220 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates") pod "keda-metrics-apiserver-7c9f485588-7s7lp" (UID: "ebf9d827-13eb-44a1-a841-0e24ae998e78") : references non-existent secret key: tls.crt Apr 22 20:04:47.258641 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:47.258607 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-5967l" event={"ID":"b9dc7534-2ad7-4297-aad1-38a4ec791392","Type":"ContainerStarted","Data":"3d4ee9497968283ee6a06178d37bde2440f9355e1c65e49087dccbb68316046d"} Apr 22 20:04:47.259196 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:47.258727 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-5967l" Apr 22 20:04:47.274417 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:47.274371 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-5967l" podStartSLOduration=1.743538598 podStartE2EDuration="3.274356417s" podCreationTimestamp="2026-04-22 20:04:44 +0000 UTC" firstStartedPulling="2026-04-22 20:04:44.66641436 +0000 UTC m=+432.254905671" lastFinishedPulling="2026-04-22 20:04:46.197232172 +0000 UTC m=+433.785723490" observedRunningTime="2026-04-22 20:04:47.272902372 +0000 UTC m=+434.861393701" watchObservedRunningTime="2026-04-22 20:04:47.274356417 +0000 UTC m=+434.862847746" Apr 22 20:04:47.323921 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:47.323885 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz" Apr 22 20:04:47.324098 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:47.324023 2548 secret.go:281] 
references non-existent secret key: ca.crt Apr 22 20:04:47.324098 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:47.324043 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 20:04:47.324098 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:47.324052 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f5jvz: references non-existent secret key: ca.crt Apr 22 20:04:47.324313 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:47.324105 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates podName:fc7217f1-10c5-45f1-996f-ee2a270254bd nodeName:}" failed. No retries permitted until 2026-04-22 20:04:51.324090021 +0000 UTC m=+438.912581328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates") pod "keda-operator-ffbb595cb-f5jvz" (UID: "fc7217f1-10c5-45f1-996f-ee2a270254bd") : references non-existent secret key: ca.crt Apr 22 20:04:47.626721 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:47.626619 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" Apr 22 20:04:47.626877 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:47.626758 2548 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:04:47.626877 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:47.626777 2548 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:04:47.626877 ip-10-0-139-10 
kubenswrapper[2548]: E0422 20:04:47.626796 2548 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp: references non-existent secret key: tls.crt Apr 22 20:04:47.626877 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:04:47.626851 2548 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates podName:ebf9d827-13eb-44a1-a841-0e24ae998e78 nodeName:}" failed. No retries permitted until 2026-04-22 20:04:51.626834729 +0000 UTC m=+439.215326035 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates") pod "keda-metrics-apiserver-7c9f485588-7s7lp" (UID: "ebf9d827-13eb-44a1-a841-0e24ae998e78") : references non-existent secret key: tls.crt Apr 22 20:04:51.361094 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:51.361054 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz" Apr 22 20:04:51.363629 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:51.363594 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fc7217f1-10c5-45f1-996f-ee2a270254bd-certificates\") pod \"keda-operator-ffbb595cb-f5jvz\" (UID: \"fc7217f1-10c5-45f1-996f-ee2a270254bd\") " pod="openshift-keda/keda-operator-ffbb595cb-f5jvz" Apr 22 20:04:51.431395 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:51.431358 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-f5jvz" Apr 22 20:04:51.554025 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:51.553996 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-f5jvz"] Apr 22 20:04:51.556399 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:04:51.556367 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc7217f1_10c5_45f1_996f_ee2a270254bd.slice/crio-3dab4afaeee0c491735c154fdc4fb74ec598cd4d864bc836ce3e5e4dd99bde8a WatchSource:0}: Error finding container 3dab4afaeee0c491735c154fdc4fb74ec598cd4d864bc836ce3e5e4dd99bde8a: Status 404 returned error can't find the container with id 3dab4afaeee0c491735c154fdc4fb74ec598cd4d864bc836ce3e5e4dd99bde8a Apr 22 20:04:51.664905 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:51.664815 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" Apr 22 20:04:51.667588 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:51.667561 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ebf9d827-13eb-44a1-a841-0e24ae998e78-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7s7lp\" (UID: \"ebf9d827-13eb-44a1-a841-0e24ae998e78\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" Apr 22 20:04:51.736011 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:51.735978 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" Apr 22 20:04:51.857338 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:51.857312 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp"] Apr 22 20:04:51.859825 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:04:51.859792 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf9d827_13eb_44a1_a841_0e24ae998e78.slice/crio-74eff1e6935044089d37c9f3a8f3c99bec544bcdf400f0ff3a71de59104b9bb7 WatchSource:0}: Error finding container 74eff1e6935044089d37c9f3a8f3c99bec544bcdf400f0ff3a71de59104b9bb7: Status 404 returned error can't find the container with id 74eff1e6935044089d37c9f3a8f3c99bec544bcdf400f0ff3a71de59104b9bb7 Apr 22 20:04:52.276799 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:52.276765 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" event={"ID":"ebf9d827-13eb-44a1-a841-0e24ae998e78","Type":"ContainerStarted","Data":"74eff1e6935044089d37c9f3a8f3c99bec544bcdf400f0ff3a71de59104b9bb7"} Apr 22 20:04:52.277873 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:52.277837 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-f5jvz" event={"ID":"fc7217f1-10c5-45f1-996f-ee2a270254bd","Type":"ContainerStarted","Data":"3dab4afaeee0c491735c154fdc4fb74ec598cd4d864bc836ce3e5e4dd99bde8a"} Apr 22 20:04:56.293165 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:56.293125 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" event={"ID":"ebf9d827-13eb-44a1-a841-0e24ae998e78","Type":"ContainerStarted","Data":"442421c4d881b54043aa67ece57ae701eb450ff088bc5511fc91b0b180d58410"} Apr 22 20:04:56.293665 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:56.293286 2548 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" Apr 22 20:04:56.297299 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:56.297267 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-f5jvz" event={"ID":"fc7217f1-10c5-45f1-996f-ee2a270254bd","Type":"ContainerStarted","Data":"81d93d3e2a1e202eef90facfc076f9dd81b733f2ce448dded2292cd4939f763c"} Apr 22 20:04:56.297466 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:56.297393 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-f5jvz" Apr 22 20:04:56.309707 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:56.309666 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" podStartSLOduration=9.684708805 podStartE2EDuration="13.309652289s" podCreationTimestamp="2026-04-22 20:04:43 +0000 UTC" firstStartedPulling="2026-04-22 20:04:51.861197008 +0000 UTC m=+439.449688319" lastFinishedPulling="2026-04-22 20:04:55.486140483 +0000 UTC m=+443.074631803" observedRunningTime="2026-04-22 20:04:56.308967525 +0000 UTC m=+443.897458855" watchObservedRunningTime="2026-04-22 20:04:56.309652289 +0000 UTC m=+443.898143617" Apr 22 20:04:56.324647 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:04:56.324599 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-f5jvz" podStartSLOduration=9.394749931 podStartE2EDuration="13.324585639s" podCreationTimestamp="2026-04-22 20:04:43 +0000 UTC" firstStartedPulling="2026-04-22 20:04:51.557703636 +0000 UTC m=+439.146194943" lastFinishedPulling="2026-04-22 20:04:55.487539335 +0000 UTC m=+443.076030651" observedRunningTime="2026-04-22 20:04:56.323187718 +0000 UTC m=+443.911679046" watchObservedRunningTime="2026-04-22 20:04:56.324585639 +0000 UTC m=+443.913076968" Apr 22 20:05:04.248792 ip-10-0-139-10 
kubenswrapper[2548]: I0422 20:05:04.248750 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qjsp5" Apr 22 20:05:07.305514 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:07.305487 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7s7lp" Apr 22 20:05:08.264491 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:08.264454 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-5967l" Apr 22 20:05:17.302722 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:17.302684 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-f5jvz" Apr 22 20:05:49.561757 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.561720 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-665c47d676-ktbsh"] Apr 22 20:05:49.563876 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.563861 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:05:49.566508 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.566485 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 20:05:49.566642 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.566522 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 20:05:49.567704 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.567684 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-9mqfq\"" Apr 22 20:05:49.567704 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.567696 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 20:05:49.573681 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.573659 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-ktbsh"] Apr 22 20:05:49.591941 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.591914 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-g7qwk"] Apr 22 20:05:49.594187 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.594171 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:05:49.596837 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.596816 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-q9sqg\"" Apr 22 20:05:49.596949 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.596818 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 20:05:49.602635 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.602610 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-g7qwk"] Apr 22 20:05:49.653326 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.653243 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/08b05a5c-381d-4156-9f3c-02cee169527a-data\") pod \"seaweedfs-86cc847c5c-g7qwk\" (UID: \"08b05a5c-381d-4156-9f3c-02cee169527a\") " pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:05:49.653498 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.653389 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ee206b0-8bd6-4abb-8f6a-937573561af4-cert\") pod \"kserve-controller-manager-665c47d676-ktbsh\" (UID: \"6ee206b0-8bd6-4abb-8f6a-937573561af4\") " pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:05:49.653498 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.653482 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mg44\" (UniqueName: \"kubernetes.io/projected/6ee206b0-8bd6-4abb-8f6a-937573561af4-kube-api-access-2mg44\") pod \"kserve-controller-manager-665c47d676-ktbsh\" (UID: \"6ee206b0-8bd6-4abb-8f6a-937573561af4\") " pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:05:49.653569 ip-10-0-139-10 kubenswrapper[2548]: I0422 
20:05:49.653525 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4mt4\" (UniqueName: \"kubernetes.io/projected/08b05a5c-381d-4156-9f3c-02cee169527a-kube-api-access-w4mt4\") pod \"seaweedfs-86cc847c5c-g7qwk\" (UID: \"08b05a5c-381d-4156-9f3c-02cee169527a\") " pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:05:49.754576 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.754532 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ee206b0-8bd6-4abb-8f6a-937573561af4-cert\") pod \"kserve-controller-manager-665c47d676-ktbsh\" (UID: \"6ee206b0-8bd6-4abb-8f6a-937573561af4\") " pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:05:49.754738 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.754610 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mg44\" (UniqueName: \"kubernetes.io/projected/6ee206b0-8bd6-4abb-8f6a-937573561af4-kube-api-access-2mg44\") pod \"kserve-controller-manager-665c47d676-ktbsh\" (UID: \"6ee206b0-8bd6-4abb-8f6a-937573561af4\") " pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:05:49.754738 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.754638 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4mt4\" (UniqueName: \"kubernetes.io/projected/08b05a5c-381d-4156-9f3c-02cee169527a-kube-api-access-w4mt4\") pod \"seaweedfs-86cc847c5c-g7qwk\" (UID: \"08b05a5c-381d-4156-9f3c-02cee169527a\") " pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:05:49.754738 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.754662 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/08b05a5c-381d-4156-9f3c-02cee169527a-data\") pod \"seaweedfs-86cc847c5c-g7qwk\" (UID: \"08b05a5c-381d-4156-9f3c-02cee169527a\") " 
pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:05:49.755043 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.755019 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/08b05a5c-381d-4156-9f3c-02cee169527a-data\") pod \"seaweedfs-86cc847c5c-g7qwk\" (UID: \"08b05a5c-381d-4156-9f3c-02cee169527a\") " pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:05:49.757109 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.757081 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ee206b0-8bd6-4abb-8f6a-937573561af4-cert\") pod \"kserve-controller-manager-665c47d676-ktbsh\" (UID: \"6ee206b0-8bd6-4abb-8f6a-937573561af4\") " pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:05:49.763111 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.763089 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mg44\" (UniqueName: \"kubernetes.io/projected/6ee206b0-8bd6-4abb-8f6a-937573561af4-kube-api-access-2mg44\") pod \"kserve-controller-manager-665c47d676-ktbsh\" (UID: \"6ee206b0-8bd6-4abb-8f6a-937573561af4\") " pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:05:49.763215 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.763122 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4mt4\" (UniqueName: \"kubernetes.io/projected/08b05a5c-381d-4156-9f3c-02cee169527a-kube-api-access-w4mt4\") pod \"seaweedfs-86cc847c5c-g7qwk\" (UID: \"08b05a5c-381d-4156-9f3c-02cee169527a\") " pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:05:49.874675 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.874561 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:05:49.905491 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:49.905462 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:05:50.006669 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:50.006526 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-ktbsh"] Apr 22 20:05:50.009690 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:05:50.009650 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ee206b0_8bd6_4abb_8f6a_937573561af4.slice/crio-024eb01100629eeaeba4614234cecbb422b86ae5fd06301eb90bfd8763feb8c0 WatchSource:0}: Error finding container 024eb01100629eeaeba4614234cecbb422b86ae5fd06301eb90bfd8763feb8c0: Status 404 returned error can't find the container with id 024eb01100629eeaeba4614234cecbb422b86ae5fd06301eb90bfd8763feb8c0 Apr 22 20:05:50.038753 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:50.038722 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-g7qwk"] Apr 22 20:05:50.041439 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:05:50.041410 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08b05a5c_381d_4156_9f3c_02cee169527a.slice/crio-4910f7832c29265c81c530489ed16ee4b84c1a850926b42ae22ce282dc91e10c WatchSource:0}: Error finding container 4910f7832c29265c81c530489ed16ee4b84c1a850926b42ae22ce282dc91e10c: Status 404 returned error can't find the container with id 4910f7832c29265c81c530489ed16ee4b84c1a850926b42ae22ce282dc91e10c Apr 22 20:05:50.472980 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:50.472923 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" 
event={"ID":"6ee206b0-8bd6-4abb-8f6a-937573561af4","Type":"ContainerStarted","Data":"024eb01100629eeaeba4614234cecbb422b86ae5fd06301eb90bfd8763feb8c0"} Apr 22 20:05:50.474412 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:50.474371 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-g7qwk" event={"ID":"08b05a5c-381d-4156-9f3c-02cee169527a","Type":"ContainerStarted","Data":"4910f7832c29265c81c530489ed16ee4b84c1a850926b42ae22ce282dc91e10c"} Apr 22 20:05:54.488830 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:54.488790 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-g7qwk" event={"ID":"08b05a5c-381d-4156-9f3c-02cee169527a","Type":"ContainerStarted","Data":"c97b32d102b00acc694a328209c79b03f4bd79db6eb893b6227cd4842e683fa3"} Apr 22 20:05:54.489308 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:54.488848 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:05:54.490126 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:54.490103 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" event={"ID":"6ee206b0-8bd6-4abb-8f6a-937573561af4","Type":"ContainerStarted","Data":"320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef"} Apr 22 20:05:54.490283 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:54.490271 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:05:54.505810 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:54.505757 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-g7qwk" podStartSLOduration=1.7723530790000002 podStartE2EDuration="5.505743876s" podCreationTimestamp="2026-04-22 20:05:49 +0000 UTC" firstStartedPulling="2026-04-22 20:05:50.042833481 +0000 UTC m=+497.631324792" lastFinishedPulling="2026-04-22 
20:05:53.776224265 +0000 UTC m=+501.364715589" observedRunningTime="2026-04-22 20:05:54.504713047 +0000 UTC m=+502.093204414" watchObservedRunningTime="2026-04-22 20:05:54.505743876 +0000 UTC m=+502.094235204" Apr 22 20:05:54.520677 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:05:54.520624 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" podStartSLOduration=1.828886102 podStartE2EDuration="5.520610421s" podCreationTimestamp="2026-04-22 20:05:49 +0000 UTC" firstStartedPulling="2026-04-22 20:05:50.011465628 +0000 UTC m=+497.599956936" lastFinishedPulling="2026-04-22 20:05:53.703189947 +0000 UTC m=+501.291681255" observedRunningTime="2026-04-22 20:05:54.519545176 +0000 UTC m=+502.108036506" watchObservedRunningTime="2026-04-22 20:05:54.520610421 +0000 UTC m=+502.109101749" Apr 22 20:06:00.495389 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:00.495350 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-g7qwk" Apr 22 20:06:24.716316 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.716281 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-ktbsh"] Apr 22 20:06:24.716879 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.716532 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" podUID="6ee206b0-8bd6-4abb-8f6a-937573561af4" containerName="manager" containerID="cri-o://320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef" gracePeriod=10 Apr 22 20:06:24.721562 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.721535 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:06:24.741481 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.741453 2548 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/kserve-controller-manager-665c47d676-5lpr5"] Apr 22 20:06:24.743596 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.743581 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:24.752159 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.752129 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-5lpr5"] Apr 22 20:06:24.758545 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.758519 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwd44\" (UniqueName: \"kubernetes.io/projected/3ced6819-5e53-4bb5-b565-408c9f42f696-kube-api-access-pwd44\") pod \"kserve-controller-manager-665c47d676-5lpr5\" (UID: \"3ced6819-5e53-4bb5-b565-408c9f42f696\") " pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:24.758673 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.758622 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ced6819-5e53-4bb5-b565-408c9f42f696-cert\") pod \"kserve-controller-manager-665c47d676-5lpr5\" (UID: \"3ced6819-5e53-4bb5-b565-408c9f42f696\") " pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:24.859807 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.859761 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwd44\" (UniqueName: \"kubernetes.io/projected/3ced6819-5e53-4bb5-b565-408c9f42f696-kube-api-access-pwd44\") pod \"kserve-controller-manager-665c47d676-5lpr5\" (UID: \"3ced6819-5e53-4bb5-b565-408c9f42f696\") " pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:24.860018 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.859878 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3ced6819-5e53-4bb5-b565-408c9f42f696-cert\") pod \"kserve-controller-manager-665c47d676-5lpr5\" (UID: \"3ced6819-5e53-4bb5-b565-408c9f42f696\") " pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:24.862654 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.862616 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ced6819-5e53-4bb5-b565-408c9f42f696-cert\") pod \"kserve-controller-manager-665c47d676-5lpr5\" (UID: \"3ced6819-5e53-4bb5-b565-408c9f42f696\") " pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:24.868994 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.868954 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwd44\" (UniqueName: \"kubernetes.io/projected/3ced6819-5e53-4bb5-b565-408c9f42f696-kube-api-access-pwd44\") pod \"kserve-controller-manager-665c47d676-5lpr5\" (UID: \"3ced6819-5e53-4bb5-b565-408c9f42f696\") " pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:24.955964 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:24.955941 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:06:25.060647 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.060614 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ee206b0-8bd6-4abb-8f6a-937573561af4-cert\") pod \"6ee206b0-8bd6-4abb-8f6a-937573561af4\" (UID: \"6ee206b0-8bd6-4abb-8f6a-937573561af4\") " Apr 22 20:06:25.060827 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.060674 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mg44\" (UniqueName: \"kubernetes.io/projected/6ee206b0-8bd6-4abb-8f6a-937573561af4-kube-api-access-2mg44\") pod \"6ee206b0-8bd6-4abb-8f6a-937573561af4\" (UID: \"6ee206b0-8bd6-4abb-8f6a-937573561af4\") " Apr 22 20:06:25.063044 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.063010 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee206b0-8bd6-4abb-8f6a-937573561af4-cert" (OuterVolumeSpecName: "cert") pod "6ee206b0-8bd6-4abb-8f6a-937573561af4" (UID: "6ee206b0-8bd6-4abb-8f6a-937573561af4"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:06:25.063044 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.063011 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee206b0-8bd6-4abb-8f6a-937573561af4-kube-api-access-2mg44" (OuterVolumeSpecName: "kube-api-access-2mg44") pod "6ee206b0-8bd6-4abb-8f6a-937573561af4" (UID: "6ee206b0-8bd6-4abb-8f6a-937573561af4"). InnerVolumeSpecName "kube-api-access-2mg44". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:06:25.108218 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.108191 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:25.161702 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.161674 2548 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ee206b0-8bd6-4abb-8f6a-937573561af4-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:06:25.161702 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.161701 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2mg44\" (UniqueName: \"kubernetes.io/projected/6ee206b0-8bd6-4abb-8f6a-937573561af4-kube-api-access-2mg44\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:06:25.229189 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.229153 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-5lpr5"] Apr 22 20:06:25.232461 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:06:25.232434 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ced6819_5e53_4bb5_b565_408c9f42f696.slice/crio-6c81ef306708e7f10f355ca912eb063ccc5f5e10fa8151f97420098f8b71f4d7 WatchSource:0}: Error finding container 6c81ef306708e7f10f355ca912eb063ccc5f5e10fa8151f97420098f8b71f4d7: Status 404 returned error can't find the container with id 6c81ef306708e7f10f355ca912eb063ccc5f5e10fa8151f97420098f8b71f4d7 Apr 22 20:06:25.596310 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.596284 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-5lpr5" event={"ID":"3ced6819-5e53-4bb5-b565-408c9f42f696","Type":"ContainerStarted","Data":"6c81ef306708e7f10f355ca912eb063ccc5f5e10fa8151f97420098f8b71f4d7"} Apr 22 20:06:25.597288 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.597238 2548 generic.go:358] "Generic (PLEG): container finished" podID="6ee206b0-8bd6-4abb-8f6a-937573561af4" 
containerID="320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef" exitCode=0 Apr 22 20:06:25.597367 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.597327 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" event={"ID":"6ee206b0-8bd6-4abb-8f6a-937573561af4","Type":"ContainerDied","Data":"320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef"} Apr 22 20:06:25.597367 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.597354 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" event={"ID":"6ee206b0-8bd6-4abb-8f6a-937573561af4","Type":"ContainerDied","Data":"024eb01100629eeaeba4614234cecbb422b86ae5fd06301eb90bfd8763feb8c0"} Apr 22 20:06:25.597434 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.597368 2548 scope.go:117] "RemoveContainer" containerID="320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef" Apr 22 20:06:25.597434 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.597330 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-ktbsh" Apr 22 20:06:25.610470 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.610451 2548 scope.go:117] "RemoveContainer" containerID="320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef" Apr 22 20:06:25.610729 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:06:25.610709 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef\": container with ID starting with 320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef not found: ID does not exist" containerID="320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef" Apr 22 20:06:25.610792 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.610740 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef"} err="failed to get container status \"320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef\": rpc error: code = NotFound desc = could not find container \"320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef\": container with ID starting with 320cfb6c45cc5f2efd2dae138bc56a6ecf3213c9592acc759a86831deba543ef not found: ID does not exist" Apr 22 20:06:25.621482 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.621452 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-ktbsh"] Apr 22 20:06:25.622844 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:25.622823 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-ktbsh"] Apr 22 20:06:26.602969 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:26.602924 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-5lpr5" 
event={"ID":"3ced6819-5e53-4bb5-b565-408c9f42f696","Type":"ContainerStarted","Data":"b7731a6849255b3d22f8e374acee22d9cdc5342ec6e6bfc0c15278c0849cf90c"} Apr 22 20:06:26.603459 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:26.603009 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:26.619994 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:26.619943 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-665c47d676-5lpr5" podStartSLOduration=2.280585371 podStartE2EDuration="2.619930363s" podCreationTimestamp="2026-04-22 20:06:24 +0000 UTC" firstStartedPulling="2026-04-22 20:06:25.233656635 +0000 UTC m=+532.822147946" lastFinishedPulling="2026-04-22 20:06:25.57300163 +0000 UTC m=+533.161492938" observedRunningTime="2026-04-22 20:06:26.618089718 +0000 UTC m=+534.206581047" watchObservedRunningTime="2026-04-22 20:06:26.619930363 +0000 UTC m=+534.208421736" Apr 22 20:06:26.952509 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:26.952423 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee206b0-8bd6-4abb-8f6a-937573561af4" path="/var/lib/kubelet/pods/6ee206b0-8bd6-4abb-8f6a-937573561af4/volumes" Apr 22 20:06:57.611127 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:57.611094 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-665c47d676-5lpr5" Apr 22 20:06:58.463194 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.463156 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-4sz4d"] Apr 22 20:06:58.463570 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.463554 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ee206b0-8bd6-4abb-8f6a-937573561af4" containerName="manager" Apr 22 20:06:58.463625 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.463572 2548 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee206b0-8bd6-4abb-8f6a-937573561af4" containerName="manager" Apr 22 20:06:58.463659 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.463631 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ee206b0-8bd6-4abb-8f6a-937573561af4" containerName="manager" Apr 22 20:06:58.465423 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.465407 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 22 20:06:58.468795 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.468772 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 20:06:58.468795 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.468786 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-87clg\"" Apr 22 20:06:58.476965 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.476935 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-4sz4d"] Apr 22 20:06:58.555581 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.555540 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g86n\" (UniqueName: \"kubernetes.io/projected/7829822e-6695-42b9-a06c-489c7eba9e77-kube-api-access-6g86n\") pod \"model-serving-api-86f7b4b499-4sz4d\" (UID: \"7829822e-6695-42b9-a06c-489c7eba9e77\") " pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 22 20:06:58.555765 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.555616 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7829822e-6695-42b9-a06c-489c7eba9e77-tls-certs\") pod \"model-serving-api-86f7b4b499-4sz4d\" (UID: \"7829822e-6695-42b9-a06c-489c7eba9e77\") " pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 
22 20:06:58.656242 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.656209 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g86n\" (UniqueName: \"kubernetes.io/projected/7829822e-6695-42b9-a06c-489c7eba9e77-kube-api-access-6g86n\") pod \"model-serving-api-86f7b4b499-4sz4d\" (UID: \"7829822e-6695-42b9-a06c-489c7eba9e77\") " pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 22 20:06:58.656611 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.656284 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7829822e-6695-42b9-a06c-489c7eba9e77-tls-certs\") pod \"model-serving-api-86f7b4b499-4sz4d\" (UID: \"7829822e-6695-42b9-a06c-489c7eba9e77\") " pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 22 20:06:58.658777 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.658756 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7829822e-6695-42b9-a06c-489c7eba9e77-tls-certs\") pod \"model-serving-api-86f7b4b499-4sz4d\" (UID: \"7829822e-6695-42b9-a06c-489c7eba9e77\") " pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 22 20:06:58.663958 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.663930 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g86n\" (UniqueName: \"kubernetes.io/projected/7829822e-6695-42b9-a06c-489c7eba9e77-kube-api-access-6g86n\") pod \"model-serving-api-86f7b4b499-4sz4d\" (UID: \"7829822e-6695-42b9-a06c-489c7eba9e77\") " pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 22 20:06:58.776585 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.776554 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 22 20:06:58.900407 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:58.900382 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-4sz4d"] Apr 22 20:06:58.903053 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:06:58.903025 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7829822e_6695_42b9_a06c_489c7eba9e77.slice/crio-8f181bb30f8bc39fedeec44f8c7d672c42bbe37870eff5e3cefc666e0d279cc3 WatchSource:0}: Error finding container 8f181bb30f8bc39fedeec44f8c7d672c42bbe37870eff5e3cefc666e0d279cc3: Status 404 returned error can't find the container with id 8f181bb30f8bc39fedeec44f8c7d672c42bbe37870eff5e3cefc666e0d279cc3 Apr 22 20:06:59.710774 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:06:59.710737 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-4sz4d" event={"ID":"7829822e-6695-42b9-a06c-489c7eba9e77","Type":"ContainerStarted","Data":"8f181bb30f8bc39fedeec44f8c7d672c42bbe37870eff5e3cefc666e0d279cc3"} Apr 22 20:07:00.715419 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:00.715327 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-4sz4d" event={"ID":"7829822e-6695-42b9-a06c-489c7eba9e77","Type":"ContainerStarted","Data":"454dcae1fdde4a10d8ed8b8be83e90c6d8a715af366687e57a904dcff498a36d"} Apr 22 20:07:00.715825 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:00.715455 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 22 20:07:00.734331 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:00.734277 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-4sz4d" podStartSLOduration=1.26518184 podStartE2EDuration="2.73423755s" podCreationTimestamp="2026-04-22 
20:06:58 +0000 UTC" firstStartedPulling="2026-04-22 20:06:58.904852674 +0000 UTC m=+566.493343986" lastFinishedPulling="2026-04-22 20:07:00.373908385 +0000 UTC m=+567.962399696" observedRunningTime="2026-04-22 20:07:00.733076067 +0000 UTC m=+568.321567397" watchObservedRunningTime="2026-04-22 20:07:00.73423755 +0000 UTC m=+568.322728880" Apr 22 20:07:02.782474 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.782438 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bbdd8fdbf-27d5v"] Apr 22 20:07:02.784721 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.784701 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.796225 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.796199 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bbdd8fdbf-27d5v"] Apr 22 20:07:02.894145 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.894108 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-service-ca\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.894366 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.894151 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9471ff1a-5151-4738-b94b-073f81a2084b-console-serving-cert\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.894366 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.894206 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjszt\" (UniqueName: 
\"kubernetes.io/projected/9471ff1a-5151-4738-b94b-073f81a2084b-kube-api-access-kjszt\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.894366 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.894243 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9471ff1a-5151-4738-b94b-073f81a2084b-console-oauth-config\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.894366 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.894288 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-console-config\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.894366 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.894304 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-trusted-ca-bundle\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.894366 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.894320 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-oauth-serving-cert\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.995666 ip-10-0-139-10 kubenswrapper[2548]: 
I0422 20:07:02.995632 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjszt\" (UniqueName: \"kubernetes.io/projected/9471ff1a-5151-4738-b94b-073f81a2084b-kube-api-access-kjszt\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.995854 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.995684 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9471ff1a-5151-4738-b94b-073f81a2084b-console-oauth-config\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.995854 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.995826 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-console-config\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.995975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.995866 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-trusted-ca-bundle\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.995975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.995916 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-oauth-serving-cert\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " 
pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.996087 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.996030 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-service-ca\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.996087 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.996073 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9471ff1a-5151-4738-b94b-073f81a2084b-console-serving-cert\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.996587 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.996562 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-console-config\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.996897 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.996701 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-oauth-serving-cert\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.996897 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.996782 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-service-ca\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") 
" pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.997322 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.997299 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9471ff1a-5151-4738-b94b-073f81a2084b-trusted-ca-bundle\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.998441 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.998421 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9471ff1a-5151-4738-b94b-073f81a2084b-console-oauth-config\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:02.998596 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:02.998580 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9471ff1a-5151-4738-b94b-073f81a2084b-console-serving-cert\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:03.003744 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:03.003726 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjszt\" (UniqueName: \"kubernetes.io/projected/9471ff1a-5151-4738-b94b-073f81a2084b-kube-api-access-kjszt\") pod \"console-bbdd8fdbf-27d5v\" (UID: \"9471ff1a-5151-4738-b94b-073f81a2084b\") " pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:03.094318 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:03.094207 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:03.223784 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:03.223758 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bbdd8fdbf-27d5v"] Apr 22 20:07:03.226820 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:07:03.226790 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9471ff1a_5151_4738_b94b_073f81a2084b.slice/crio-d2fef16360b552d3c054d533d39d57fd2f4c4ec4fa03a96e41109cf039146152 WatchSource:0}: Error finding container d2fef16360b552d3c054d533d39d57fd2f4c4ec4fa03a96e41109cf039146152: Status 404 returned error can't find the container with id d2fef16360b552d3c054d533d39d57fd2f4c4ec4fa03a96e41109cf039146152 Apr 22 20:07:03.728397 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:03.728354 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bbdd8fdbf-27d5v" event={"ID":"9471ff1a-5151-4738-b94b-073f81a2084b","Type":"ContainerStarted","Data":"59edc229d6b0d42f0fbaac231b7feb36b647bca88b25ed8345897708c4f3a2c9"} Apr 22 20:07:03.728598 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:03.728404 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bbdd8fdbf-27d5v" event={"ID":"9471ff1a-5151-4738-b94b-073f81a2084b","Type":"ContainerStarted","Data":"d2fef16360b552d3c054d533d39d57fd2f4c4ec4fa03a96e41109cf039146152"} Apr 22 20:07:03.750452 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:03.750402 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bbdd8fdbf-27d5v" podStartSLOduration=1.750386909 podStartE2EDuration="1.750386909s" podCreationTimestamp="2026-04-22 20:07:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:07:03.747950672 +0000 UTC m=+571.336441999" 
watchObservedRunningTime="2026-04-22 20:07:03.750386909 +0000 UTC m=+571.338878238" Apr 22 20:07:11.723076 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:11.723041 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-4sz4d" Apr 22 20:07:13.095390 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:13.095354 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:13.095773 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:13.095408 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:13.100442 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:13.100418 2548 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:13.772813 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:13.772784 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bbdd8fdbf-27d5v" Apr 22 20:07:13.814237 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:13.814203 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59c9c9c9c7-lbd2s"] Apr 22 20:07:32.861127 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:32.861099 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:07:32.861587 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:32.861544 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:07:33.061764 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.061730 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr"] Apr 
22 20:07:33.065364 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.065342 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" Apr 22 20:07:33.068041 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.068021 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4jdsb\"" Apr 22 20:07:33.071086 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.071058 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr"] Apr 22 20:07:33.077377 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.077356 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" Apr 22 20:07:33.211542 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.211504 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr"] Apr 22 20:07:33.213821 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:07:33.213788 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5422e60e_979d_4694_9d2a_d5a39b594f24.slice/crio-6fce9d8b8f6c435d972e84024853aa5059bc1e0a41658e51159dfec428181fb4 WatchSource:0}: Error finding container 6fce9d8b8f6c435d972e84024853aa5059bc1e0a41658e51159dfec428181fb4: Status 404 returned error can't find the container with id 6fce9d8b8f6c435d972e84024853aa5059bc1e0a41658e51159dfec428181fb4 Apr 22 20:07:33.248831 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.248802 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82"] Apr 22 20:07:33.253585 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.253566 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" Apr 22 20:07:33.258487 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.258457 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82"] Apr 22 20:07:33.363443 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.363412 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd40035f-f552-4a36-9a51-3f23cced5fa1-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82\" (UID: \"bd40035f-f552-4a36-9a51-3f23cced5fa1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" Apr 22 20:07:33.463966 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.463876 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd40035f-f552-4a36-9a51-3f23cced5fa1-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82\" (UID: \"bd40035f-f552-4a36-9a51-3f23cced5fa1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" Apr 22 20:07:33.464282 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.464234 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd40035f-f552-4a36-9a51-3f23cced5fa1-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82\" (UID: \"bd40035f-f552-4a36-9a51-3f23cced5fa1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" Apr 22 20:07:33.565438 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.565400 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" Apr 22 20:07:33.740294 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.740259 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82"] Apr 22 20:07:33.746591 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:07:33.746553 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd40035f_f552_4a36_9a51_3f23cced5fa1.slice/crio-7bd1aaf2ee449041ab26b448eb2f076b2b9a3ba1352916c4a0351f072b89fb19 WatchSource:0}: Error finding container 7bd1aaf2ee449041ab26b448eb2f076b2b9a3ba1352916c4a0351f072b89fb19: Status 404 returned error can't find the container with id 7bd1aaf2ee449041ab26b448eb2f076b2b9a3ba1352916c4a0351f072b89fb19 Apr 22 20:07:33.838011 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.837967 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" event={"ID":"bd40035f-f552-4a36-9a51-3f23cced5fa1","Type":"ContainerStarted","Data":"7bd1aaf2ee449041ab26b448eb2f076b2b9a3ba1352916c4a0351f072b89fb19"} Apr 22 20:07:33.840527 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:33.840493 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" event={"ID":"5422e60e-979d-4694-9d2a-d5a39b594f24","Type":"ContainerStarted","Data":"6fce9d8b8f6c435d972e84024853aa5059bc1e0a41658e51159dfec428181fb4"} Apr 22 20:07:38.834807 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:38.834759 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59c9c9c9c7-lbd2s" podUID="28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" containerName="console" containerID="cri-o://003bcfc280bb0f47627dc526ea53f2d335f18623c564259504b53a414cfd6455" gracePeriod=15 Apr 22 20:07:39.871797 ip-10-0-139-10 
kubenswrapper[2548]: I0422 20:07:39.871754 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59c9c9c9c7-lbd2s_28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48/console/0.log" Apr 22 20:07:39.872330 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:39.871808 2548 generic.go:358] "Generic (PLEG): container finished" podID="28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" containerID="003bcfc280bb0f47627dc526ea53f2d335f18623c564259504b53a414cfd6455" exitCode=2 Apr 22 20:07:39.872330 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:39.871883 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c9c9c9c7-lbd2s" event={"ID":"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48","Type":"ContainerDied","Data":"003bcfc280bb0f47627dc526ea53f2d335f18623c564259504b53a414cfd6455"} Apr 22 20:07:42.052023 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:42.051972 2548 patch_prober.go:28] interesting pod/console-59c9c9c9c7-lbd2s container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.23:8443/health\": context deadline exceeded" start-of-body= Apr 22 20:07:42.052556 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:42.052054 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-59c9c9c9c7-lbd2s" podUID="28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" containerName="console" probeResult="failure" output="Get \"https://10.133.0.23:8443/health\": context deadline exceeded" Apr 22 20:07:43.259819 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.259793 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59c9c9c9c7-lbd2s_28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48/console/0.log" Apr 22 20:07:43.260229 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.259854 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:07:43.368757 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.368717 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-config\") pod \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " Apr 22 20:07:43.368956 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.368770 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-oauth-config\") pod \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " Apr 22 20:07:43.368956 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.368841 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65ncq\" (UniqueName: \"kubernetes.io/projected/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-kube-api-access-65ncq\") pod \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " Apr 22 20:07:43.368956 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.368877 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-oauth-serving-cert\") pod \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " Apr 22 20:07:43.368956 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.368910 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-service-ca\") pod \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " Apr 22 20:07:43.368956 ip-10-0-139-10 
kubenswrapper[2548]: I0422 20:07:43.368930 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-trusted-ca-bundle\") pod \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " Apr 22 20:07:43.369205 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.368975 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-serving-cert\") pod \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\" (UID: \"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48\") " Apr 22 20:07:43.369293 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.369243 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" (UID: "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:07:43.369578 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.369492 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-service-ca" (OuterVolumeSpecName: "service-ca") pod "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" (UID: "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:07:43.369578 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.369530 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-config" (OuterVolumeSpecName: "console-config") pod "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" (UID: "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:07:43.369925 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.369901 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" (UID: "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:07:43.371863 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.371834 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" (UID: "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:07:43.372042 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.372019 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" (UID: "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:07:43.372120 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.372063 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-kube-api-access-65ncq" (OuterVolumeSpecName: "kube-api-access-65ncq") pod "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" (UID: "28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48"). InnerVolumeSpecName "kube-api-access-65ncq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:07:43.470603 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.470567 2548 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:07:43.470603 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.470601 2548 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-oauth-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:07:43.470603 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.470613 2548 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65ncq\" (UniqueName: \"kubernetes.io/projected/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-kube-api-access-65ncq\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:07:43.470868 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.470622 2548 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-oauth-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:07:43.470868 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.470631 2548 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-service-ca\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:07:43.470868 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.470643 2548 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-trusted-ca-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:07:43.470868 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.470652 2548 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48-console-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 20:07:43.888106 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.888074 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59c9c9c9c7-lbd2s_28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48/console/0.log" Apr 22 20:07:43.888309 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.888199 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c9c9c9c7-lbd2s" event={"ID":"28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48","Type":"ContainerDied","Data":"8faba7f577ab60e7b69b4ae18dc58a26a90d0678c95dcfb5e077769315d3291e"} Apr 22 20:07:43.888309 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.888209 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59c9c9c9c7-lbd2s" Apr 22 20:07:43.888309 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.888274 2548 scope.go:117] "RemoveContainer" containerID="003bcfc280bb0f47627dc526ea53f2d335f18623c564259504b53a414cfd6455" Apr 22 20:07:43.910319 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.910287 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59c9c9c9c7-lbd2s"] Apr 22 20:07:43.914127 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:43.914104 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59c9c9c9c7-lbd2s"] Apr 22 20:07:44.953803 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:44.953708 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" path="/var/lib/kubelet/pods/28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48/volumes" Apr 22 20:07:46.901522 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:46.901484 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" event={"ID":"bd40035f-f552-4a36-9a51-3f23cced5fa1","Type":"ContainerStarted","Data":"89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c"} Apr 22 20:07:46.903645 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:46.903618 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" event={"ID":"5422e60e-979d-4694-9d2a-d5a39b594f24","Type":"ContainerStarted","Data":"0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b"} Apr 22 20:07:46.903813 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:46.903801 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" Apr 22 20:07:46.905100 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:46.905069 2548 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 20:07:46.930299 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:46.930235 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" podStartSLOduration=0.87370141 podStartE2EDuration="13.930219867s" podCreationTimestamp="2026-04-22 20:07:33 +0000 UTC" firstStartedPulling="2026-04-22 20:07:33.21570336 +0000 UTC m=+600.804194668" lastFinishedPulling="2026-04-22 20:07:46.272221807 +0000 UTC m=+613.860713125" observedRunningTime="2026-04-22 20:07:46.928691838 +0000 UTC m=+614.517183165" watchObservedRunningTime="2026-04-22 20:07:46.930219867 +0000 UTC m=+614.518711197" Apr 22 20:07:47.907081 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:47.907037 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 20:07:49.914982 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:49.914950 2548 generic.go:358] "Generic (PLEG): container finished" podID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerID="89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c" exitCode=0 Apr 22 20:07:49.915356 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:49.915026 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" event={"ID":"bd40035f-f552-4a36-9a51-3f23cced5fa1","Type":"ContainerDied","Data":"89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c"} Apr 22 20:07:57.907678 ip-10-0-139-10 kubenswrapper[2548]: I0422 
20:07:57.907630 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 20:07:57.945601 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:57.945565 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" event={"ID":"bd40035f-f552-4a36-9a51-3f23cced5fa1","Type":"ContainerStarted","Data":"55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b"} Apr 22 20:07:57.945933 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:57.945909 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" Apr 22 20:07:57.947235 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:57.947204 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 20:07:57.963031 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:07:57.962756 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podStartSLOduration=1.752293385 podStartE2EDuration="24.962742815s" podCreationTimestamp="2026-04-22 20:07:33 +0000 UTC" firstStartedPulling="2026-04-22 20:07:33.750213855 +0000 UTC m=+601.338705175" lastFinishedPulling="2026-04-22 20:07:56.960663278 +0000 UTC m=+624.549154605" observedRunningTime="2026-04-22 20:07:57.961911069 +0000 UTC m=+625.550402397" watchObservedRunningTime="2026-04-22 20:07:57.962742815 +0000 UTC m=+625.551234143" Apr 22 20:07:58.949130 ip-10-0-139-10 
kubenswrapper[2548]: I0422 20:07:58.949085 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 20:08:07.907474 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:07.907408 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 20:08:08.949616 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:08.949566 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 20:08:17.908010 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:17.907963 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 20:08:18.949127 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:18.949079 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 20:08:27.907936 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:27.907888 2548 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 20:08:28.949428 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:28.949376 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 20:08:37.908451 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:37.908414 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" Apr 22 20:08:38.949617 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:38.949570 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 20:08:48.949529 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:48.949479 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 20:08:58.949299 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:08:58.949257 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 20:09:07.149577 
ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.149531 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr"] Apr 22 20:09:07.149953 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.149798 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container" containerID="cri-o://0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b" gracePeriod=30 Apr 22 20:09:07.183184 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.183149 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d"] Apr 22 20:09:07.183553 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.183540 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" containerName="console" Apr 22 20:09:07.183603 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.183554 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" containerName="console" Apr 22 20:09:07.183636 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.183616 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="28266ac7-5fb8-4cf5-aaaf-0cfd5246cd48" containerName="console" Apr 22 20:09:07.186686 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.186667 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" Apr 22 20:09:07.194342 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.194314 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d"] Apr 22 20:09:07.197667 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.197649 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" Apr 22 20:09:07.328502 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.328478 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d"] Apr 22 20:09:07.331172 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:09:07.331138 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8581ee20_7918_4f69_a314_5845e5ebe269.slice/crio-5184c2319e2eee80499b61f3f1994046c06eb8ca3266f935c4d5a8d18b4cc803 WatchSource:0}: Error finding container 5184c2319e2eee80499b61f3f1994046c06eb8ca3266f935c4d5a8d18b4cc803: Status 404 returned error can't find the container with id 5184c2319e2eee80499b61f3f1994046c06eb8ca3266f935c4d5a8d18b4cc803 Apr 22 20:09:07.333161 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.333138 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:09:07.908082 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:07.908040 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 20:09:08.197440 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:08.197349 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" event={"ID":"8581ee20-7918-4f69-a314-5845e5ebe269","Type":"ContainerStarted","Data":"8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606"} Apr 22 20:09:08.197440 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:08.197387 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" event={"ID":"8581ee20-7918-4f69-a314-5845e5ebe269","Type":"ContainerStarted","Data":"5184c2319e2eee80499b61f3f1994046c06eb8ca3266f935c4d5a8d18b4cc803"} Apr 22 20:09:08.197844 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:08.197549 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" Apr 22 20:09:08.198883 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:08.198842 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 20:09:08.211934 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:08.211885 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" podStartSLOduration=1.211869981 podStartE2EDuration="1.211869981s" podCreationTimestamp="2026-04-22 20:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:09:08.210405167 +0000 UTC m=+695.798896497" watchObservedRunningTime="2026-04-22 20:09:08.211869981 +0000 UTC m=+695.800361309" Apr 22 20:09:08.952628 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:08.952598 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" Apr 22 20:09:09.201497 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:09.201463 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 20:09:10.401497 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:10.401475 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" Apr 22 20:09:11.209106 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:11.209065 2548 generic.go:358] "Generic (PLEG): container finished" podID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerID="0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b" exitCode=0 Apr 22 20:09:11.209313 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:11.209158 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr"
Apr 22 20:09:11.209313 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:11.209153 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" event={"ID":"5422e60e-979d-4694-9d2a-d5a39b594f24","Type":"ContainerDied","Data":"0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b"}
Apr 22 20:09:11.209313 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:11.209202 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr" event={"ID":"5422e60e-979d-4694-9d2a-d5a39b594f24","Type":"ContainerDied","Data":"6fce9d8b8f6c435d972e84024853aa5059bc1e0a41658e51159dfec428181fb4"}
Apr 22 20:09:11.209313 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:11.209223 2548 scope.go:117] "RemoveContainer" containerID="0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b"
Apr 22 20:09:11.217459 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:11.217432 2548 scope.go:117] "RemoveContainer" containerID="0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b"
Apr 22 20:09:11.217737 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:09:11.217718 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b\": container with ID starting with 0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b not found: ID does not exist" containerID="0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b"
Apr 22 20:09:11.217796 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:11.217746 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b"} err="failed to get container status \"0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b\": rpc error: code = NotFound desc = could not find container \"0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b\": container with ID starting with 0bb674f89c608a689e9019370a2079dce0e0d027fd2ea76c660aa9371ffac38b not found: ID does not exist"
Apr 22 20:09:11.225340 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:11.225303 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr"]
Apr 22 20:09:11.227237 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:11.227213 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bee44-predictor-77c6ff6f7f-bvkwr"]
Apr 22 20:09:12.952453 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:12.952411 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" path="/var/lib/kubelet/pods/5422e60e-979d-4694-9d2a-d5a39b594f24/volumes"
Apr 22 20:09:19.202182 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:19.202137 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 20:09:29.201854 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:29.201811 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 20:09:39.201925 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:39.201880 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 20:09:42.978481 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:42.978445 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82"]
Apr 22 20:09:42.978933 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:42.978718 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container" containerID="cri-o://55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b" gracePeriod=30
Apr 22 20:09:43.080018 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:43.079980 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"]
Apr 22 20:09:43.080374 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:43.080359 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container"
Apr 22 20:09:43.080374 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:43.080377 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container"
Apr 22 20:09:43.080493 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:43.080482 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="5422e60e-979d-4694-9d2a-d5a39b594f24" containerName="kserve-container"
Apr 22 20:09:43.084954 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:43.084936 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"
Apr 22 20:09:43.089306 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:43.089274 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"]
Apr 22 20:09:43.096307 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:43.096280 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"
Apr 22 20:09:43.228920 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:43.228844 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"]
Apr 22 20:09:43.232653 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:09:43.232626 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd29f44a_5537_4135_ad63_c57457c18066.slice/crio-b02f0c05342bb60e023eca0c6c46b5ffd812e9cd20753cf794b9f5010c8ebcc4 WatchSource:0}: Error finding container b02f0c05342bb60e023eca0c6c46b5ffd812e9cd20753cf794b9f5010c8ebcc4: Status 404 returned error can't find the container with id b02f0c05342bb60e023eca0c6c46b5ffd812e9cd20753cf794b9f5010c8ebcc4
Apr 22 20:09:43.329231 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:43.329186 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" event={"ID":"fd29f44a-5537-4135-ad63-c57457c18066","Type":"ContainerStarted","Data":"b02f0c05342bb60e023eca0c6c46b5ffd812e9cd20753cf794b9f5010c8ebcc4"}
Apr 22 20:09:44.334321 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:44.334281 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" event={"ID":"fd29f44a-5537-4135-ad63-c57457c18066","Type":"ContainerStarted","Data":"fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e"}
Apr 22 20:09:44.334732 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:44.334501 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"
Apr 22 20:09:44.335850 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:44.335827 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 20:09:44.348989 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:44.348926 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" podStartSLOduration=1.34890903 podStartE2EDuration="1.34890903s" podCreationTimestamp="2026-04-22 20:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:09:44.346667941 +0000 UTC m=+731.935159270" watchObservedRunningTime="2026-04-22 20:09:44.34890903 +0000 UTC m=+731.937400361"
Apr 22 20:09:45.338719 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:45.338672 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 20:09:47.722877 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:47.722855 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82"
Apr 22 20:09:47.758130 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:47.758097 2548 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd40035f-f552-4a36-9a51-3f23cced5fa1-kserve-provision-location\") pod \"bd40035f-f552-4a36-9a51-3f23cced5fa1\" (UID: \"bd40035f-f552-4a36-9a51-3f23cced5fa1\") "
Apr 22 20:09:47.758459 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:47.758435 2548 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd40035f-f552-4a36-9a51-3f23cced5fa1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bd40035f-f552-4a36-9a51-3f23cced5fa1" (UID: "bd40035f-f552-4a36-9a51-3f23cced5fa1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:09:47.859265 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:47.859160 2548 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd40035f-f552-4a36-9a51-3f23cced5fa1-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 20:09:48.350679 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.350644 2548 generic.go:358] "Generic (PLEG): container finished" podID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerID="55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b" exitCode=0
Apr 22 20:09:48.350891 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.350694 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" event={"ID":"bd40035f-f552-4a36-9a51-3f23cced5fa1","Type":"ContainerDied","Data":"55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b"}
Apr 22 20:09:48.350891 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.350718 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82" event={"ID":"bd40035f-f552-4a36-9a51-3f23cced5fa1","Type":"ContainerDied","Data":"7bd1aaf2ee449041ab26b448eb2f076b2b9a3ba1352916c4a0351f072b89fb19"}
Apr 22 20:09:48.350891 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.350732 2548 scope.go:117] "RemoveContainer" containerID="55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b"
Apr 22 20:09:48.350891 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.350734 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82"
Apr 22 20:09:48.359803 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.359783 2548 scope.go:117] "RemoveContainer" containerID="89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c"
Apr 22 20:09:48.367654 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.367635 2548 scope.go:117] "RemoveContainer" containerID="55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b"
Apr 22 20:09:48.367917 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:09:48.367894 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b\": container with ID starting with 55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b not found: ID does not exist" containerID="55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b"
Apr 22 20:09:48.368014 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.367925 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b"} err="failed to get container status \"55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b\": rpc error: code = NotFound desc = could not find container \"55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b\": container with ID starting with 55f1b4129f4c7e0521c3bb16f858db3291b5b9e32bbbc2bddf7b27d275d35e9b not found: ID does not exist"
Apr 22 20:09:48.368014 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.367945 2548 scope.go:117] "RemoveContainer" containerID="89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c"
Apr 22 20:09:48.368212 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:09:48.368195 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c\": container with ID starting with 89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c not found: ID does not exist" containerID="89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c"
Apr 22 20:09:48.368283 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.368219 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c"} err="failed to get container status \"89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c\": rpc error: code = NotFound desc = could not find container \"89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c\": container with ID starting with 89ef0243b26a40f75cd4c16780295cea2434ba1b44ace2224c4bfe0595f5be7c not found: ID does not exist"
Apr 22 20:09:48.371774 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.371753 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82"]
Apr 22 20:09:48.374351 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.374327 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-8f468c9d4-gzq82"]
Apr 22 20:09:48.957988 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:48.957953 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" path="/var/lib/kubelet/pods/bd40035f-f552-4a36-9a51-3f23cced5fa1/volumes"
Apr 22 20:09:49.202415 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:49.202366 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 20:09:55.339608 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:55.339560 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 20:09:59.203163 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:09:59.203134 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d"
Apr 22 20:10:05.339191 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:10:05.339138 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 20:10:15.339739 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:10:15.339684 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 20:10:25.338780 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:10:25.338734 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 20:10:35.340442 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:10:35.340407 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"
Apr 22 20:12:32.888113 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:12:32.888080 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:12:32.888994 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:12:32.888972 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:17:32.918377 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:17:32.918343 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:17:32.920480 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:17:32.920458 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:18:32.042368 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.042334 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d"]
Apr 22 20:18:32.042890 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.042559 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container" containerID="cri-o://8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606" gracePeriod=30
Apr 22 20:18:32.101689 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.101659 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"]
Apr 22 20:18:32.102035 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.102022 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container"
Apr 22 20:18:32.102078 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.102037 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container"
Apr 22 20:18:32.102078 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.102048 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="storage-initializer"
Apr 22 20:18:32.102078 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.102054 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="storage-initializer"
Apr 22 20:18:32.102170 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.102109 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd40035f-f552-4a36-9a51-3f23cced5fa1" containerName="kserve-container"
Apr 22 20:18:32.105025 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.105005 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"
Apr 22 20:18:32.113528 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.113503 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"]
Apr 22 20:18:32.115212 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.115194 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"
Apr 22 20:18:32.247187 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.247150 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"]
Apr 22 20:18:32.250452 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:18:32.250413 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04314805_4e69_4a7e_aa64_54431bdca2d1.slice/crio-72a0c8c4c21b711ab97a344525803ff65c5bffc8d46924704f27561b490e265c WatchSource:0}: Error finding container 72a0c8c4c21b711ab97a344525803ff65c5bffc8d46924704f27561b490e265c: Status 404 returned error can't find the container with id 72a0c8c4c21b711ab97a344525803ff65c5bffc8d46924704f27561b490e265c
Apr 22 20:18:32.255954 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:32.255931 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:18:33.163181 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:33.163142 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" event={"ID":"04314805-4e69-4a7e-aa64-54431bdca2d1","Type":"ContainerStarted","Data":"677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425"}
Apr 22 20:18:33.163181 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:33.163181 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" event={"ID":"04314805-4e69-4a7e-aa64-54431bdca2d1","Type":"ContainerStarted","Data":"72a0c8c4c21b711ab97a344525803ff65c5bffc8d46924704f27561b490e265c"}
Apr 22 20:18:33.163668 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:33.163294 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"
Apr 22 20:18:33.164828 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:33.164801 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 22 20:18:33.177055 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:33.177004 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" podStartSLOduration=1.176989189 podStartE2EDuration="1.176989189s" podCreationTimestamp="2026-04-22 20:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:18:33.176306569 +0000 UTC m=+1260.764797910" watchObservedRunningTime="2026-04-22 20:18:33.176989189 +0000 UTC m=+1260.765480519"
Apr 22 20:18:34.166537 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:34.166499 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 22 20:18:35.391644 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:35.391620 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d"
Apr 22 20:18:36.173524 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.173485 2548 generic.go:358] "Generic (PLEG): container finished" podID="8581ee20-7918-4f69-a314-5845e5ebe269" containerID="8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606" exitCode=0
Apr 22 20:18:36.173718 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.173562 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" event={"ID":"8581ee20-7918-4f69-a314-5845e5ebe269","Type":"ContainerDied","Data":"8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606"}
Apr 22 20:18:36.173718 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.173602 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d" event={"ID":"8581ee20-7918-4f69-a314-5845e5ebe269","Type":"ContainerDied","Data":"5184c2319e2eee80499b61f3f1994046c06eb8ca3266f935c4d5a8d18b4cc803"}
Apr 22 20:18:36.173718 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.173619 2548 scope.go:117] "RemoveContainer" containerID="8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606"
Apr 22 20:18:36.173718 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.173573 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d"
Apr 22 20:18:36.182204 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.182190 2548 scope.go:117] "RemoveContainer" containerID="8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606"
Apr 22 20:18:36.182542 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:18:36.182519 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606\": container with ID starting with 8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606 not found: ID does not exist" containerID="8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606"
Apr 22 20:18:36.182593 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.182554 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606"} err="failed to get container status \"8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606\": rpc error: code = NotFound desc = could not find container \"8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606\": container with ID starting with 8c6a55fa69f1c7c9f0a22cd4f26c0b36697710ed6bb136bca558159d0e115606 not found: ID does not exist"
Apr 22 20:18:36.195616 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.195587 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d"]
Apr 22 20:18:36.198892 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.198869 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-51a0f-predictor-798d4666d9-p9f2d"]
Apr 22 20:18:36.953700 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:36.953667 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" path="/var/lib/kubelet/pods/8581ee20-7918-4f69-a314-5845e5ebe269/volumes"
Apr 22 20:18:44.166761 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:44.166705 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 22 20:18:54.167073 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:18:54.167021 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 22 20:19:04.167366 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:04.167317 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 22 20:19:07.861136 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:07.861088 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r"]
Apr 22 20:19:07.861608 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:07.861501 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container"
Apr 22 20:19:07.861608 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:07.861514 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container"
Apr 22 20:19:07.861608 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:07.861572 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="8581ee20-7918-4f69-a314-5845e5ebe269" containerName="kserve-container"
Apr 22 20:19:07.864702 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:07.864681 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r"
Apr 22 20:19:07.874611 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:07.874580 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r"]
Apr 22 20:19:07.877031 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:07.877010 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r"
Apr 22 20:19:07.879852 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:07.879828 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"]
Apr 22 20:19:07.880145 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:07.880118 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" containerID="cri-o://fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e" gracePeriod=30
Apr 22 20:19:08.009769 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:08.009742 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r"]
Apr 22 20:19:08.012454 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:19:08.012421 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd746f546_fcd1_4773_b3f5_4b7707179cb3.slice/crio-050983cf0f84c0b7ba291f2bf1e00bcd14ce6e77eaae914d29f6bed8328385ab WatchSource:0}: Error finding container 050983cf0f84c0b7ba291f2bf1e00bcd14ce6e77eaae914d29f6bed8328385ab: Status 404 returned error can't find the container with id 050983cf0f84c0b7ba291f2bf1e00bcd14ce6e77eaae914d29f6bed8328385ab
Apr 22 20:19:08.286825 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:08.286782 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" event={"ID":"d746f546-fcd1-4773-b3f5-4b7707179cb3","Type":"ContainerStarted","Data":"a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7"}
Apr 22 20:19:08.286825 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:08.286818 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" event={"ID":"d746f546-fcd1-4773-b3f5-4b7707179cb3","Type":"ContainerStarted","Data":"050983cf0f84c0b7ba291f2bf1e00bcd14ce6e77eaae914d29f6bed8328385ab"}
Apr 22 20:19:08.287056 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:08.286924 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r"
Apr 22 20:19:08.288386 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:08.288357 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 22 20:19:08.301534 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:08.301491 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" podStartSLOduration=1.3014763010000001 podStartE2EDuration="1.301476301s" podCreationTimestamp="2026-04-22 20:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:19:08.300202347 +0000 UTC m=+1295.888693676" watchObservedRunningTime="2026-04-22 20:19:08.301476301 +0000 UTC m=+1295.889967697"
Apr 22 20:19:09.290856 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:09.290811 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 22 20:19:11.229790 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.229765 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"
Apr 22 20:19:11.298035 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.298000 2548 generic.go:358] "Generic (PLEG): container finished" podID="fd29f44a-5537-4135-ad63-c57457c18066" containerID="fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e" exitCode=0
Apr 22 20:19:11.298201 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.298062 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" event={"ID":"fd29f44a-5537-4135-ad63-c57457c18066","Type":"ContainerDied","Data":"fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e"}
Apr 22 20:19:11.298201 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.298070 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"
Apr 22 20:19:11.298201 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.298090 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt" event={"ID":"fd29f44a-5537-4135-ad63-c57457c18066","Type":"ContainerDied","Data":"b02f0c05342bb60e023eca0c6c46b5ffd812e9cd20753cf794b9f5010c8ebcc4"}
Apr 22 20:19:11.298201 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.298105 2548 scope.go:117] "RemoveContainer" containerID="fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e"
Apr 22 20:19:11.306904 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.306884 2548 scope.go:117] "RemoveContainer" containerID="fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e"
Apr 22 20:19:11.307180 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:19:11.307156 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e\": container with ID starting with fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e not found: ID does not exist" containerID="fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e"
Apr 22 20:19:11.307235 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.307183 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e"} err="failed to get container status \"fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e\": rpc error: code = NotFound desc = could not find container \"fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e\": container with ID starting with fe43df0b2929a3b8351f3ec59fc398d1ea4636421e4952eddd52d0355885956e not found: ID does not exist"
Apr 22 20:19:11.319003 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.318952 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"]
Apr 22 20:19:11.320625 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:11.320604 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cbc99-predictor-758d77d98c-br4zt"]
Apr 22 20:19:12.953376 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:12.953337 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd29f44a-5537-4135-ad63-c57457c18066" path="/var/lib/kubelet/pods/fd29f44a-5537-4135-ad63-c57457c18066/volumes"
Apr 22 20:19:14.167141 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:14.167103 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 22 20:19:19.291022 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:19.290972 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 22 20:19:24.168366 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:24.168336 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"
Apr 22 20:19:29.291570 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:29.291523 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection
refused" Apr 22 20:19:39.290994 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:39.290945 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:19:49.291810 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:49.291707 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:19:52.377507 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.377469 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"] Apr 22 20:19:52.378065 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.377784 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" containerID="cri-o://677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425" gracePeriod=30 Apr 22 20:19:52.408768 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.408723 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr"] Apr 22 20:19:52.409135 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.409121 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" Apr 22 20:19:52.409176 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.409140 2548 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" Apr 22 20:19:52.409239 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.409229 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd29f44a-5537-4135-ad63-c57457c18066" containerName="kserve-container" Apr 22 20:19:52.413681 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.413661 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" Apr 22 20:19:52.429435 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.429383 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr"] Apr 22 20:19:52.432176 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.432150 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" Apr 22 20:19:52.585612 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:52.585520 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr"] Apr 22 20:19:52.588551 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:19:52.588512 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd82de427_4208_414d_aaa6_6d36b423e599.slice/crio-2805c85442b9ee6b52cbf23159903c9add23baadbe52eec4ff359c03b6ac11c3 WatchSource:0}: Error finding container 2805c85442b9ee6b52cbf23159903c9add23baadbe52eec4ff359c03b6ac11c3: Status 404 returned error can't find the container with id 2805c85442b9ee6b52cbf23159903c9add23baadbe52eec4ff359c03b6ac11c3 Apr 22 20:19:53.441913 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:53.441871 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" 
event={"ID":"d82de427-4208-414d-aaa6-6d36b423e599","Type":"ContainerStarted","Data":"235a0db8ca240fcb8a24a1f3d0e472668a39b22a7598d35199b2772858688a03"} Apr 22 20:19:53.441913 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:53.441911 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" event={"ID":"d82de427-4208-414d-aaa6-6d36b423e599","Type":"ContainerStarted","Data":"2805c85442b9ee6b52cbf23159903c9add23baadbe52eec4ff359c03b6ac11c3"} Apr 22 20:19:53.442391 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:53.442045 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" Apr 22 20:19:53.443497 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:53.443470 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:19:53.457560 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:53.457513 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" podStartSLOduration=1.457500032 podStartE2EDuration="1.457500032s" podCreationTimestamp="2026-04-22 20:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:19:53.455292389 +0000 UTC m=+1341.043783720" watchObservedRunningTime="2026-04-22 20:19:53.457500032 +0000 UTC m=+1341.045991361" Apr 22 20:19:54.167634 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:54.167586 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 22 20:19:54.445458 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:54.445370 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:19:56.128640 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.128616 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" Apr 22 20:19:56.457176 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.457085 2548 generic.go:358] "Generic (PLEG): container finished" podID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerID="677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425" exitCode=0 Apr 22 20:19:56.457176 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.457153 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" event={"ID":"04314805-4e69-4a7e-aa64-54431bdca2d1","Type":"ContainerDied","Data":"677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425"} Apr 22 20:19:56.457424 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.457174 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" Apr 22 20:19:56.457424 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.457189 2548 scope.go:117] "RemoveContainer" containerID="677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425" Apr 22 20:19:56.457424 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.457179 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79" event={"ID":"04314805-4e69-4a7e-aa64-54431bdca2d1","Type":"ContainerDied","Data":"72a0c8c4c21b711ab97a344525803ff65c5bffc8d46924704f27561b490e265c"} Apr 22 20:19:56.465472 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.465450 2548 scope.go:117] "RemoveContainer" containerID="677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425" Apr 22 20:19:56.465743 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:19:56.465725 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425\": container with ID starting with 677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425 not found: ID does not exist" containerID="677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425" Apr 22 20:19:56.465805 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.465759 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425"} err="failed to get container status \"677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425\": rpc error: code = NotFound desc = could not find container \"677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425\": container with ID starting with 677b2df7fa9eb186a4e125084749a4d039581fa5750fa52c969cf9ed8c7bc425 not found: ID does not exist" Apr 22 20:19:56.477127 ip-10-0-139-10 
kubenswrapper[2548]: I0422 20:19:56.477102 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"] Apr 22 20:19:56.483794 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.479216 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb62f-predictor-56b9b76457-hzh79"] Apr 22 20:19:56.953385 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:56.953350 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" path="/var/lib/kubelet/pods/04314805-4e69-4a7e-aa64-54431bdca2d1/volumes" Apr 22 20:19:59.292172 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:19:59.292137 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" Apr 22 20:20:04.446141 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:04.446095 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:20:14.446185 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:14.446132 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:20:24.445572 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:24.445520 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection 
refused" Apr 22 20:20:28.090602 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.090566 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r"] Apr 22 20:20:28.091054 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.090796 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" containerID="cri-o://a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7" gracePeriod=30 Apr 22 20:20:28.110997 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.110967 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk"] Apr 22 20:20:28.111378 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.111364 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" Apr 22 20:20:28.111378 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.111380 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" Apr 22 20:20:28.111477 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.111457 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="04314805-4e69-4a7e-aa64-54431bdca2d1" containerName="kserve-container" Apr 22 20:20:28.114421 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.114402 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" Apr 22 20:20:28.120847 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.120819 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk"] Apr 22 20:20:28.126608 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.126578 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" Apr 22 20:20:28.256665 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.256629 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk"] Apr 22 20:20:28.260766 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:20:28.260733 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b1b2e8_ca98_481c_bb6c_d146b4f21d26.slice/crio-fa67df5513c36dc493f67f03dd64ea6d75da6b069224b776bc6514012e1ee586 WatchSource:0}: Error finding container fa67df5513c36dc493f67f03dd64ea6d75da6b069224b776bc6514012e1ee586: Status 404 returned error can't find the container with id fa67df5513c36dc493f67f03dd64ea6d75da6b069224b776bc6514012e1ee586 Apr 22 20:20:28.567612 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.567578 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" event={"ID":"39b1b2e8-ca98-481c-bb6c-d146b4f21d26","Type":"ContainerStarted","Data":"ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066"} Apr 22 20:20:28.567612 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.567618 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" 
event={"ID":"39b1b2e8-ca98-481c-bb6c-d146b4f21d26","Type":"ContainerStarted","Data":"fa67df5513c36dc493f67f03dd64ea6d75da6b069224b776bc6514012e1ee586"} Apr 22 20:20:28.567875 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.567742 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" Apr 22 20:20:28.569207 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.569179 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:20:28.583551 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:28.583505 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" podStartSLOduration=0.583491748 podStartE2EDuration="583.491748ms" podCreationTimestamp="2026-04-22 20:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:20:28.581122504 +0000 UTC m=+1376.169613829" watchObservedRunningTime="2026-04-22 20:20:28.583491748 +0000 UTC m=+1376.171983071" Apr 22 20:20:29.291167 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:29.291124 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 22 20:20:29.571120 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:29.571027 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" 
podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:20:31.439764 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.439738 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" Apr 22 20:20:31.579371 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.579283 2548 generic.go:358] "Generic (PLEG): container finished" podID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerID="a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7" exitCode=0 Apr 22 20:20:31.579371 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.579354 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" event={"ID":"d746f546-fcd1-4773-b3f5-4b7707179cb3","Type":"ContainerDied","Data":"a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7"} Apr 22 20:20:31.579595 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.579377 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" event={"ID":"d746f546-fcd1-4773-b3f5-4b7707179cb3","Type":"ContainerDied","Data":"050983cf0f84c0b7ba291f2bf1e00bcd14ce6e77eaae914d29f6bed8328385ab"} Apr 22 20:20:31.579595 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.579393 2548 scope.go:117] "RemoveContainer" containerID="a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7" Apr 22 20:20:31.579595 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.579418 2548 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r" Apr 22 20:20:31.588056 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.588039 2548 scope.go:117] "RemoveContainer" containerID="a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7" Apr 22 20:20:31.588323 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:20:31.588303 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7\": container with ID starting with a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7 not found: ID does not exist" containerID="a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7" Apr 22 20:20:31.588374 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.588332 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7"} err="failed to get container status \"a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7\": rpc error: code = NotFound desc = could not find container \"a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7\": container with ID starting with a928dab40020354f609490f48bceb80996b96a3ae1be0fe8ddfac9b7010490e7 not found: ID does not exist" Apr 22 20:20:31.598873 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.598845 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r"] Apr 22 20:20:31.602422 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:31.602399 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-03046-predictor-768fcd7df-mvg4r"] Apr 22 20:20:32.954808 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:32.954768 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" 
path="/var/lib/kubelet/pods/d746f546-fcd1-4773-b3f5-4b7707179cb3/volumes" Apr 22 20:20:34.446377 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:34.446338 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 22 20:20:39.571365 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:39.571317 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:20:44.447465 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:44.447423 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" Apr 22 20:20:49.572056 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:49.572012 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:20:59.571410 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:20:59.571368 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:21:09.571521 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:21:09.571476 2548 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 20:21:19.572955 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:21:19.572923 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" Apr 22 20:22:32.941297 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:22:32.941241 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:22:32.944202 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:22:32.944180 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:27:32.966507 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:27:32.966478 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:27:32.976505 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:27:32.976480 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:29:17.261418 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.261381 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr"] Apr 22 20:29:17.261938 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.261771 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container" 
containerID="cri-o://235a0db8ca240fcb8a24a1f3d0e472668a39b22a7598d35199b2772858688a03" gracePeriod=30 Apr 22 20:29:17.329684 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.329650 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"] Apr 22 20:29:17.330034 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.330022 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" Apr 22 20:29:17.330090 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.330037 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" Apr 22 20:29:17.330090 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.330087 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="d746f546-fcd1-4773-b3f5-4b7707179cb3" containerName="kserve-container" Apr 22 20:29:17.333002 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.332982 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" Apr 22 20:29:17.339536 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.339498 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"] Apr 22 20:29:17.344190 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.344173 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"
Apr 22 20:29:17.477301 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.477277 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"]
Apr 22 20:29:17.479653 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:29:17.479622 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f27af2_f92d_4344_aa45_d7d7375d18c0.slice/crio-f1634f4ce7c4981f1803416b0c5f47f0d7ab003172bdcd911f2d0ff64fd395f3 WatchSource:0}: Error finding container f1634f4ce7c4981f1803416b0c5f47f0d7ab003172bdcd911f2d0ff64fd395f3: Status 404 returned error can't find the container with id f1634f4ce7c4981f1803416b0c5f47f0d7ab003172bdcd911f2d0ff64fd395f3
Apr 22 20:29:17.481509 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:17.481490 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:29:18.388975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:18.388935 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" event={"ID":"33f27af2-f92d-4344-aa45-d7d7375d18c0","Type":"ContainerStarted","Data":"1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff"}
Apr 22 20:29:18.388975 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:18.388976 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" event={"ID":"33f27af2-f92d-4344-aa45-d7d7375d18c0","Type":"ContainerStarted","Data":"f1634f4ce7c4981f1803416b0c5f47f0d7ab003172bdcd911f2d0ff64fd395f3"}
Apr 22 20:29:18.389453 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:18.389200 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"
Apr 22 20:29:18.390527 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:18.390497 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 22 20:29:18.404315 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:18.404240 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" podStartSLOduration=1.40422706 podStartE2EDuration="1.40422706s" podCreationTimestamp="2026-04-22 20:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:29:18.402694029 +0000 UTC m=+1905.991185355" watchObservedRunningTime="2026-04-22 20:29:18.40422706 +0000 UTC m=+1905.992718388"
Apr 22 20:29:19.394149 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:19.394085 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 22 20:29:20.399477 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:20.399441 2548 generic.go:358] "Generic (PLEG): container finished" podID="d82de427-4208-414d-aaa6-6d36b423e599" containerID="235a0db8ca240fcb8a24a1f3d0e472668a39b22a7598d35199b2772858688a03" exitCode=0
Apr 22 20:29:20.399831 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:20.399519 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" event={"ID":"d82de427-4208-414d-aaa6-6d36b423e599","Type":"ContainerDied","Data":"235a0db8ca240fcb8a24a1f3d0e472668a39b22a7598d35199b2772858688a03"}
Apr 22 20:29:20.511587 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:20.511564 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr"
Apr 22 20:29:21.404286 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:21.404241 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr"
Apr 22 20:29:21.404286 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:21.404273 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr" event={"ID":"d82de427-4208-414d-aaa6-6d36b423e599","Type":"ContainerDied","Data":"2805c85442b9ee6b52cbf23159903c9add23baadbe52eec4ff359c03b6ac11c3"}
Apr 22 20:29:21.404794 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:21.404314 2548 scope.go:117] "RemoveContainer" containerID="235a0db8ca240fcb8a24a1f3d0e472668a39b22a7598d35199b2772858688a03"
Apr 22 20:29:21.420305 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:21.420280 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr"]
Apr 22 20:29:21.424410 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:21.424383 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08e8b-predictor-7cdfb7f54f-2tdgr"]
Apr 22 20:29:22.955491 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:22.955458 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d82de427-4208-414d-aaa6-6d36b423e599" path="/var/lib/kubelet/pods/d82de427-4208-414d-aaa6-6d36b423e599/volumes"
Apr 22 20:29:29.394878 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:29.394831 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 22 20:29:39.394578 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:39.394527 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 22 20:29:49.395134 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:49.395086 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 22 20:29:52.975212 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:52.975177 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk"]
Apr 22 20:29:52.975667 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:52.975409 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container" containerID="cri-o://ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066" gracePeriod=30
Apr 22 20:29:53.006016 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.005983 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"]
Apr 22 20:29:53.006444 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.006425 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container"
Apr 22 20:29:53.006536 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.006446 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container"
Apr 22 20:29:53.006597 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.006547 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="d82de427-4208-414d-aaa6-6d36b423e599" containerName="kserve-container"
Apr 22 20:29:53.010546 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.010524 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"
Apr 22 20:29:53.019916 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.019888 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"]
Apr 22 20:29:53.022022 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.022000 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"
Apr 22 20:29:53.161263 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.161224 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"]
Apr 22 20:29:53.164189 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:29:53.164156 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660a84d3_b2fe_4452_b05f_31a77de208ff.slice/crio-ed7671eb8f06289bfcefd146bc763c8a7bbb17790e16b1ff7d972e527c2d5e4e WatchSource:0}: Error finding container ed7671eb8f06289bfcefd146bc763c8a7bbb17790e16b1ff7d972e527c2d5e4e: Status 404 returned error can't find the container with id ed7671eb8f06289bfcefd146bc763c8a7bbb17790e16b1ff7d972e527c2d5e4e
Apr 22 20:29:53.513040 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.512950 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" event={"ID":"660a84d3-b2fe-4452-b05f-31a77de208ff","Type":"ContainerStarted","Data":"c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3"}
Apr 22 20:29:53.513040 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.512990 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" event={"ID":"660a84d3-b2fe-4452-b05f-31a77de208ff","Type":"ContainerStarted","Data":"ed7671eb8f06289bfcefd146bc763c8a7bbb17790e16b1ff7d972e527c2d5e4e"}
Apr 22 20:29:53.513235 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.513139 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"
Apr 22 20:29:53.514608 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.514580 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:29:53.529172 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:53.529125 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" podStartSLOduration=1.529112576 podStartE2EDuration="1.529112576s" podCreationTimestamp="2026-04-22 20:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:29:53.528365246 +0000 UTC m=+1941.116856575" watchObservedRunningTime="2026-04-22 20:29:53.529112576 +0000 UTC m=+1941.117603904"
Apr 22 20:29:54.517161 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:54.517118 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:29:56.325412 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.325387 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk"
Apr 22 20:29:56.523578 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.523535 2548 generic.go:358] "Generic (PLEG): container finished" podID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerID="ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066" exitCode=0
Apr 22 20:29:56.523763 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.523607 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" event={"ID":"39b1b2e8-ca98-481c-bb6c-d146b4f21d26","Type":"ContainerDied","Data":"ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066"}
Apr 22 20:29:56.523763 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.523630 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk" event={"ID":"39b1b2e8-ca98-481c-bb6c-d146b4f21d26","Type":"ContainerDied","Data":"fa67df5513c36dc493f67f03dd64ea6d75da6b069224b776bc6514012e1ee586"}
Apr 22 20:29:56.523763 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.523644 2548 scope.go:117] "RemoveContainer" containerID="ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066"
Apr 22 20:29:56.523763 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.523609 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk"
Apr 22 20:29:56.531968 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.531947 2548 scope.go:117] "RemoveContainer" containerID="ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066"
Apr 22 20:29:56.532218 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:29:56.532198 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066\": container with ID starting with ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066 not found: ID does not exist" containerID="ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066"
Apr 22 20:29:56.532298 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.532227 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066"} err="failed to get container status \"ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066\": rpc error: code = NotFound desc = could not find container \"ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066\": container with ID starting with ce8fd92841a745100038cf777d9eb5e800bec16cfd6acbeea0335f6d69e81066 not found: ID does not exist"
Apr 22 20:29:56.544176 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.544147 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk"]
Apr 22 20:29:56.545654 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.545633 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b78d5-predictor-5f7f44746c-s8nmk"]
Apr 22 20:29:56.952574 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:56.952484 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" path="/var/lib/kubelet/pods/39b1b2e8-ca98-481c-bb6c-d146b4f21d26/volumes"
Apr 22 20:29:59.394724 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:29:59.394673 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 22 20:30:04.517897 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:04.517846 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:30:09.395417 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:09.395383 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"
Apr 22 20:30:14.517938 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:14.517892 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:30:24.517421 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:24.517368 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:30:34.517161 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:34.517114 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:30:37.494116 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.494079 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"]
Apr 22 20:30:37.494597 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.494372 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" containerID="cri-o://1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff" gracePeriod=30
Apr 22 20:30:37.672674 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.672639 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"]
Apr 22 20:30:37.673048 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.673026 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container"
Apr 22 20:30:37.673048 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.673042 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container"
Apr 22 20:30:37.673229 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.673116 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="39b1b2e8-ca98-481c-bb6c-d146b4f21d26" containerName="kserve-container"
Apr 22 20:30:37.676389 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.676368 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"
Apr 22 20:30:37.681750 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.681726 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"]
Apr 22 20:30:37.687492 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.687472 2548 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"
Apr 22 20:30:37.822874 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:37.822849 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"]
Apr 22 20:30:37.827820 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:30:37.827788 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc455aa_5954_425d_b783_873cd79c49ce.slice/crio-4412b1ebd4c5a923a1d4dd2bf2b15bf9708c86b6c0118c03237d0823d55241f7 WatchSource:0}: Error finding container 4412b1ebd4c5a923a1d4dd2bf2b15bf9708c86b6c0118c03237d0823d55241f7: Status 404 returned error can't find the container with id 4412b1ebd4c5a923a1d4dd2bf2b15bf9708c86b6c0118c03237d0823d55241f7
Apr 22 20:30:38.664150 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:38.664104 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" event={"ID":"9cc455aa-5954-425d-b783-873cd79c49ce","Type":"ContainerStarted","Data":"3c3a30799547fe49b9e390ec0651b61deea93731d100123cb771e545bb412724"}
Apr 22 20:30:38.664150 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:38.664145 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" event={"ID":"9cc455aa-5954-425d-b783-873cd79c49ce","Type":"ContainerStarted","Data":"4412b1ebd4c5a923a1d4dd2bf2b15bf9708c86b6c0118c03237d0823d55241f7"}
Apr 22 20:30:38.664577 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:38.664380 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"
Apr 22 20:30:38.665589 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:38.665564 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:30:38.679097 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:38.679052 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" podStartSLOduration=1.6790407269999998 podStartE2EDuration="1.679040727s" podCreationTimestamp="2026-04-22 20:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:30:38.677651637 +0000 UTC m=+1986.266142966" watchObservedRunningTime="2026-04-22 20:30:38.679040727 +0000 UTC m=+1986.267532121"
Apr 22 20:30:39.394271 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:39.394204 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused"
Apr 22 20:30:39.668429 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:39.668331 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:30:40.844036 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:40.844009 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"
Apr 22 20:30:41.677589 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:41.677549 2548 generic.go:358] "Generic (PLEG): container finished" podID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerID="1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff" exitCode=0
Apr 22 20:30:41.677788 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:41.677618 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"
Apr 22 20:30:41.677788 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:41.677624 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" event={"ID":"33f27af2-f92d-4344-aa45-d7d7375d18c0","Type":"ContainerDied","Data":"1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff"}
Apr 22 20:30:41.677788 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:41.677723 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk" event={"ID":"33f27af2-f92d-4344-aa45-d7d7375d18c0","Type":"ContainerDied","Data":"f1634f4ce7c4981f1803416b0c5f47f0d7ab003172bdcd911f2d0ff64fd395f3"}
Apr 22 20:30:41.677788 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:41.677738 2548 scope.go:117] "RemoveContainer" containerID="1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff"
Apr 22 20:30:41.686147 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:41.686125 2548 scope.go:117] "RemoveContainer" containerID="1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff"
Apr 22 20:30:41.686435 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:30:41.686414 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff\": container with ID starting with 1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff not found: ID does not exist" containerID="1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff"
Apr 22 20:30:41.686489 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:41.686445 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff"} err="failed to get container status \"1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff\": rpc error: code = NotFound desc = could not find container \"1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff\": container with ID starting with 1a2019a61f69fa8195efcae3647aeb72a0560cac8d878a97838fba0b619e49ff not found: ID does not exist"
Apr 22 20:30:41.693434 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:41.693410 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"]
Apr 22 20:30:41.696429 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:41.696406 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9d58e-predictor-b6d5cd5c7-9b7fk"]
Apr 22 20:30:42.952136 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:42.952103 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" path="/var/lib/kubelet/pods/33f27af2-f92d-4344-aa45-d7d7375d18c0/volumes"
Apr 22 20:30:44.519358 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:44.519320 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"
Apr 22 20:30:49.668939 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:49.668892 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:30:59.669433 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:30:59.669382 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:31:09.669076 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:31:09.669029 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:31:19.668635 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:31:19.668583 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 22 20:31:29.669948 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:31:29.669920 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"
Apr 22 20:32:32.988421 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:32:32.988387 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:32:32.999526 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:32:32.999498 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:37:33.011181 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:37:33.011154 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:37:33.023530 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:37:33.023507 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:40:02.501368 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:02.501320 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"]
Apr 22 20:40:02.501958 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:02.501632 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" containerID="cri-o://3c3a30799547fe49b9e390ec0651b61deea93731d100123cb771e545bb412724" gracePeriod=30
Apr 22 20:40:05.560886 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:05.560844 2548 generic.go:358] "Generic (PLEG): container finished" podID="9cc455aa-5954-425d-b783-873cd79c49ce" containerID="3c3a30799547fe49b9e390ec0651b61deea93731d100123cb771e545bb412724" exitCode=0
Apr 22 20:40:05.560886 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:05.560894 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" event={"ID":"9cc455aa-5954-425d-b783-873cd79c49ce","Type":"ContainerDied","Data":"3c3a30799547fe49b9e390ec0651b61deea93731d100123cb771e545bb412724"}
Apr 22 20:40:05.655312 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:05.655288 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"
Apr 22 20:40:06.565807 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:06.565761 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2" event={"ID":"9cc455aa-5954-425d-b783-873cd79c49ce","Type":"ContainerDied","Data":"4412b1ebd4c5a923a1d4dd2bf2b15bf9708c86b6c0118c03237d0823d55241f7"}
Apr 22 20:40:06.566215 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:06.565817 2548 scope.go:117] "RemoveContainer" containerID="3c3a30799547fe49b9e390ec0651b61deea93731d100123cb771e545bb412724"
Apr 22 20:40:06.566215 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:06.565776 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"
Apr 22 20:40:06.586894 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:06.586869 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"]
Apr 22 20:40:06.593027 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:06.593000 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-61b05-predictor-7ff5f7bc75-8jlj2"]
Apr 22 20:40:06.952590 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:40:06.952507 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" path="/var/lib/kubelet/pods/9cc455aa-5954-425d-b783-873cd79c49ce/volumes"
Apr 22 20:42:33.034173 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:42:33.034060 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:42:33.047982 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:42:33.047960 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log"
Apr 22 20:47:22.564619 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:22.564581 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"]
Apr 22 20:47:22.565220 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:22.564858 2548 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" containerID="cri-o://c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3" gracePeriod=30
Apr 22 20:47:24.517732 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:24.517677 2548 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 22 20:47:25.811369 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:25.811341 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"
Apr 22 20:47:26.078948 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.078909 2548 generic.go:358] "Generic (PLEG): container finished" podID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerID="c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3" exitCode=0
Apr 22 20:47:26.079145 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.078976 2548 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"
Apr 22 20:47:26.079145 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.078970 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" event={"ID":"660a84d3-b2fe-4452-b05f-31a77de208ff","Type":"ContainerDied","Data":"c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3"}
Apr 22 20:47:26.079145 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.079088 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6" event={"ID":"660a84d3-b2fe-4452-b05f-31a77de208ff","Type":"ContainerDied","Data":"ed7671eb8f06289bfcefd146bc763c8a7bbb17790e16b1ff7d972e527c2d5e4e"}
Apr 22 20:47:26.079145 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.079110 2548 scope.go:117] "RemoveContainer" containerID="c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3"
Apr 22 20:47:26.087856 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.087836 2548 scope.go:117] "RemoveContainer" containerID="c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3"
Apr 22 20:47:26.088096 ip-10-0-139-10 kubenswrapper[2548]: E0422 20:47:26.088075 2548 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3\": container with ID starting with c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3 not found: ID does not exist" containerID="c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3"
Apr 22 20:47:26.088157 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.088103 2548 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3"} err="failed to get container status \"c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3\": rpc error: code = NotFound desc = could not find container \"c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3\": container with ID starting with c73d40245bc4c13d864885176e44d16619c51be0e7c7e74f229bf11dace777c3 not found: ID does not exist"
Apr 22 20:47:26.098716 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.098693 2548 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"]
Apr 22 20:47:26.101843 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.101821 2548 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2a29-predictor-d65d6bb67-6f5p6"]
Apr 22 20:47:26.953403 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:26.953360 2548 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" path="/var/lib/kubelet/pods/660a84d3-b2fe-4452-b05f-31a77de208ff/volumes"
Apr 22 20:47:33.058084 ip-10-0-139-10 kubenswrapper[2548]:
I0422 20:47:33.057976 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:47:33.071857 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:33.071838 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:47:50.467656 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:50.467610 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-zcc7b_0ec9b5fe-fc19-49f4-972f-2a642d1424b6/global-pull-secret-syncer/0.log" Apr 22 20:47:50.550688 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:50.550654 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-r6qd8_9d4bf3cd-4394-40c1-a8fc-3a9c169a083c/konnectivity-agent/0.log" Apr 22 20:47:50.641625 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:50.641590 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-10.ec2.internal_baab329e26e3fec548046283f03a6805/haproxy/0.log" Apr 22 20:47:54.091520 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.091487 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1cb0d0fc-e57f-448b-8881-87ee9514f5cf/alertmanager/0.log" Apr 22 20:47:54.116039 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.116006 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1cb0d0fc-e57f-448b-8881-87ee9514f5cf/config-reloader/0.log" Apr 22 20:47:54.136775 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.136747 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1cb0d0fc-e57f-448b-8881-87ee9514f5cf/kube-rbac-proxy-web/0.log" Apr 22 20:47:54.159007 ip-10-0-139-10 kubenswrapper[2548]: 
I0422 20:47:54.158977 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1cb0d0fc-e57f-448b-8881-87ee9514f5cf/kube-rbac-proxy/0.log" Apr 22 20:47:54.181104 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.181069 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1cb0d0fc-e57f-448b-8881-87ee9514f5cf/kube-rbac-proxy-metric/0.log" Apr 22 20:47:54.200767 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.200736 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1cb0d0fc-e57f-448b-8881-87ee9514f5cf/prom-label-proxy/0.log" Apr 22 20:47:54.223164 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.223141 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1cb0d0fc-e57f-448b-8881-87ee9514f5cf/init-config-reloader/0.log" Apr 22 20:47:54.508503 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.508462 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l4m6g_f2f0b7a0-1bba-4840-81dd-1944c681644b/node-exporter/0.log" Apr 22 20:47:54.527954 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.527923 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l4m6g_f2f0b7a0-1bba-4840-81dd-1944c681644b/kube-rbac-proxy/0.log" Apr 22 20:47:54.550151 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.550122 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l4m6g_f2f0b7a0-1bba-4840-81dd-1944c681644b/init-textfile/0.log" Apr 22 20:47:54.964114 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:54.964033 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56b7f58d88-62kzz_2fff136a-97e6-4ad1-837f-9941016a24d3/telemeter-client/0.log" Apr 22 20:47:54.986039 ip-10-0-139-10 kubenswrapper[2548]: I0422 
20:47:54.986009 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56b7f58d88-62kzz_2fff136a-97e6-4ad1-837f-9941016a24d3/reload/0.log" Apr 22 20:47:55.004233 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:55.004203 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56b7f58d88-62kzz_2fff136a-97e6-4ad1-837f-9941016a24d3/kube-rbac-proxy/0.log" Apr 22 20:47:56.967824 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:56.967793 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bbdd8fdbf-27d5v_9471ff1a-5151-4738-b94b-073f81a2084b/console/0.log" Apr 22 20:47:56.996851 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:56.996819 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-wgmf6_cdf0fb12-754a-46f7-a133-7cb4a81e6bdb/download-server/0.log" Apr 22 20:47:57.504733 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.504702 2548 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp"] Apr 22 20:47:57.505073 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.505061 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" Apr 22 20:47:57.505116 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.505075 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" Apr 22 20:47:57.505116 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.505089 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" Apr 22 20:47:57.505116 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.505095 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" 
containerName="kserve-container" Apr 22 20:47:57.505116 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.505109 2548 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" Apr 22 20:47:57.505116 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.505115 2548 state_mem.go:107] "Deleted CPUSet assignment" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" Apr 22 20:47:57.505288 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.505164 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="660a84d3-b2fe-4452-b05f-31a77de208ff" containerName="kserve-container" Apr 22 20:47:57.505288 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.505173 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cc455aa-5954-425d-b783-873cd79c49ce" containerName="kserve-container" Apr 22 20:47:57.505288 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.505180 2548 memory_manager.go:356] "RemoveStaleState removing state" podUID="33f27af2-f92d-4344-aa45-d7d7375d18c0" containerName="kserve-container" Apr 22 20:47:57.508145 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.508128 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.510754 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.510720 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xlhgw\"/\"openshift-service-ca.crt\"" Apr 22 20:47:57.510874 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.510787 2548 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xlhgw\"/\"kube-root-ca.crt\"" Apr 22 20:47:57.512140 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.512122 2548 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xlhgw\"/\"default-dockercfg-pm84g\"" Apr 22 20:47:57.516845 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.516819 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp"] Apr 22 20:47:57.607085 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.607033 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-lib-modules\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.607297 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.607119 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-podres\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.607297 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.607140 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-proc\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.607297 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.607166 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd59j\" (UniqueName: \"kubernetes.io/projected/04576450-683c-4f82-b944-3b3914a308fe-kube-api-access-fd59j\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.607297 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.607278 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-sys\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.707704 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.707665 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-sys\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.707704 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.707711 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-lib-modules\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " 
pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.707940 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.707760 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-podres\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.707940 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.707780 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-proc\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.707940 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.707791 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-sys\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.707940 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.707815 2548 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd59j\" (UniqueName: \"kubernetes.io/projected/04576450-683c-4f82-b944-3b3914a308fe-kube-api-access-fd59j\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.707940 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.707859 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-proc\") pod 
\"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.707940 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.707929 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-podres\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.708294 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.707930 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04576450-683c-4f82-b944-3b3914a308fe-lib-modules\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.715578 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.715560 2548 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd59j\" (UniqueName: \"kubernetes.io/projected/04576450-683c-4f82-b944-3b3914a308fe-kube-api-access-fd59j\") pod \"perf-node-gather-daemonset-wt6kp\" (UID: \"04576450-683c-4f82-b944-3b3914a308fe\") " pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.818629 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.818520 2548 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:57.944839 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.944813 2548 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp"] Apr 22 20:47:57.947490 ip-10-0-139-10 kubenswrapper[2548]: W0422 20:47:57.947455 2548 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod04576450_683c_4f82_b944_3b3914a308fe.slice/crio-193f58b90e8707da06dd9d8a97bdcee030f692de25d7e2a481cfb60b2e3003b4 WatchSource:0}: Error finding container 193f58b90e8707da06dd9d8a97bdcee030f692de25d7e2a481cfb60b2e3003b4: Status 404 returned error can't find the container with id 193f58b90e8707da06dd9d8a97bdcee030f692de25d7e2a481cfb60b2e3003b4 Apr 22 20:47:57.949242 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.949225 2548 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:47:57.998029 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:57.998003 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8pzdx_7fa06416-712c-490d-a430-2c086187fab9/dns/0.log" Apr 22 20:47:58.017334 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:58.017314 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8pzdx_7fa06416-712c-490d-a430-2c086187fab9/kube-rbac-proxy/0.log" Apr 22 20:47:58.164468 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:58.164386 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tscxv_e37b1adb-be11-4c0a-beea-dbf70b8cda38/dns-node-resolver/0.log" Apr 22 20:47:58.186314 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:58.186280 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" 
event={"ID":"04576450-683c-4f82-b944-3b3914a308fe","Type":"ContainerStarted","Data":"bbd028ff6ca9fa67c13f1978ad6cf20a3aaf116c9b140c095400f314e4b7f37c"} Apr 22 20:47:58.186314 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:58.186316 2548 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" event={"ID":"04576450-683c-4f82-b944-3b3914a308fe","Type":"ContainerStarted","Data":"193f58b90e8707da06dd9d8a97bdcee030f692de25d7e2a481cfb60b2e3003b4"} Apr 22 20:47:58.186540 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:58.186339 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:47:58.200853 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:58.200796 2548 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" podStartSLOduration=1.200778815 podStartE2EDuration="1.200778815s" podCreationTimestamp="2026-04-22 20:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:47:58.199888037 +0000 UTC m=+3025.788379382" watchObservedRunningTime="2026-04-22 20:47:58.200778815 +0000 UTC m=+3025.789270144" Apr 22 20:47:58.530542 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:58.530508 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7dfdc46845-k74fd_910df736-6a06-4200-9361-0fcde69a47e1/registry/0.log" Apr 22 20:47:58.570226 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:58.570203 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vqlnc_dfbb3072-c0b3-48da-8291-55700270a1f3/node-ca/0.log" Apr 22 20:47:59.617489 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:59.617462 2548 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-rr5vp_26ac7310-bd02-469f-9a0e-31a38e294dc3/serve-healthcheck-canary/0.log" Apr 22 20:47:59.945597 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:59.945517 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-kq5tv_33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd/insights-operator/0.log" Apr 22 20:47:59.947922 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:47:59.947902 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-kq5tv_33ee5e09-49ac-4cdb-a6ea-ebfaeb06c4bd/insights-operator/1.log" Apr 22 20:48:00.028810 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:00.028780 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g2p6k_c66bba96-eae4-461e-b602-745b3cb8cef1/kube-rbac-proxy/0.log" Apr 22 20:48:00.056549 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:00.056516 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g2p6k_c66bba96-eae4-461e-b602-745b3cb8cef1/exporter/0.log" Apr 22 20:48:00.076837 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:00.076809 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g2p6k_c66bba96-eae4-461e-b602-745b3cb8cef1/extractor/0.log" Apr 22 20:48:01.981087 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:01.981047 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-665c47d676-5lpr5_3ced6819-5e53-4bb5-b565-408c9f42f696/manager/0.log" Apr 22 20:48:02.020850 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:02.020826 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-4sz4d_7829822e-6695-42b9-a06c-489c7eba9e77/server/0.log" Apr 22 20:48:02.308355 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:02.308281 2548 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-g7qwk_08b05a5c-381d-4156-9f3c-02cee169527a/seaweedfs/0.log" Apr 22 20:48:04.200744 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:04.200717 2548 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xlhgw/perf-node-gather-daemonset-wt6kp" Apr 22 20:48:06.006776 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:06.006748 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-lth8q_27321596-248a-4a3c-b6c9-64b406655f9f/kube-storage-version-migrator-operator/1.log" Apr 22 20:48:06.007810 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:06.007788 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-lth8q_27321596-248a-4a3c-b6c9-64b406655f9f/kube-storage-version-migrator-operator/0.log" Apr 22 20:48:07.156381 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.156354 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lq662_8071e1d3-8155-4265-9de0-c92543778149/kube-multus-additional-cni-plugins/0.log" Apr 22 20:48:07.176162 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.176138 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lq662_8071e1d3-8155-4265-9de0-c92543778149/egress-router-binary-copy/0.log" Apr 22 20:48:07.194990 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.194970 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lq662_8071e1d3-8155-4265-9de0-c92543778149/cni-plugins/0.log" Apr 22 20:48:07.214976 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.214955 2548 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lq662_8071e1d3-8155-4265-9de0-c92543778149/bond-cni-plugin/0.log" Apr 22 20:48:07.235839 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.235806 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lq662_8071e1d3-8155-4265-9de0-c92543778149/routeoverride-cni/0.log" Apr 22 20:48:07.256013 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.255987 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lq662_8071e1d3-8155-4265-9de0-c92543778149/whereabouts-cni-bincopy/0.log" Apr 22 20:48:07.278042 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.278012 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lq662_8071e1d3-8155-4265-9de0-c92543778149/whereabouts-cni/0.log" Apr 22 20:48:07.472196 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.472124 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zkw8l_ef5f98dc-99df-42ee-b6ba-f81c8f509e56/kube-multus/0.log" Apr 22 20:48:07.531527 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.531498 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jjztz_450a901e-1810-4879-8bc6-97efb2b1c9d9/network-metrics-daemon/0.log" Apr 22 20:48:07.548455 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:07.548413 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jjztz_450a901e-1810-4879-8bc6-97efb2b1c9d9/kube-rbac-proxy/0.log" Apr 22 20:48:08.306326 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:08.306301 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-controller/0.log" Apr 22 20:48:08.324013 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:08.323990 2548 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/0.log" Apr 22 20:48:08.338038 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:08.338008 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovn-acl-logging/1.log" Apr 22 20:48:08.355575 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:08.355555 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/kube-rbac-proxy-node/0.log" Apr 22 20:48:08.373370 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:08.373343 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:48:08.389891 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:08.389866 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/northd/0.log" Apr 22 20:48:08.408007 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:08.407985 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/nbdb/0.log" Apr 22 20:48:08.426497 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:08.426474 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/sbdb/0.log" Apr 22 20:48:08.516583 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:08.516558 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqc2r_3a7bdf57-222a-4e36-b827-d320c2eaaac4/ovnkube-controller/0.log" Apr 22 20:48:09.918316 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:09.918289 2548 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vrwhw_7d2588aa-5729-48a4-bd64-f7fec4d7f0fd/check-endpoints/0.log" Apr 22 20:48:09.981940 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:09.981916 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-pk9ff_2d7bbd1b-a8ac-4542-8b29-4dbfeb3569cf/network-check-target-container/0.log" Apr 22 20:48:10.811305 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:10.811273 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5xnhn_41898967-cc03-4d1a-a021-cc3f7817d848/iptables-alerter/0.log" Apr 22 20:48:11.370330 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:11.370299 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5hz9q_0f6577ea-fdb6-4a04-8c7a-6cee5fdfcc76/tuned/0.log" Apr 22 20:48:12.968505 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:12.968474 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-zckhg_edf1a1d3-9426-4a01-9072-146ecaba47db/cluster-samples-operator/0.log" Apr 22 20:48:12.985204 ip-10-0-139-10 kubenswrapper[2548]: I0422 20:48:12.985181 2548 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-zckhg_edf1a1d3-9426-4a01-9072-146ecaba47db/cluster-samples-operator-watch/0.log"