Apr 22 14:12:43.885162 ip-10-0-136-45 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 14:12:43.885174 ip-10-0-136-45 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 14:12:43.885183 ip-10-0-136-45 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 14:12:43.885399 ip-10-0-136-45 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 14:12:53.915911 ip-10-0-136-45 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 14:12:53.915928 ip-10-0-136-45 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 57d62fb64a374aeea83a49973366015f --
Apr 22 14:15:15.084948 ip-10-0-136-45 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:15:15.519046 ip-10-0-136-45 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:15.519046 ip-10-0-136-45 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:15:15.519046 ip-10-0-136-45 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:15.519046 ip-10-0-136-45 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:15:15.519046 ip-10-0-136-45 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:15.519940 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.519849 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:15:15.522949 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522934 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:15.522949 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522949 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522953 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522957 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522959 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522965 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522968 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522971 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522973 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522976 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522980 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522984 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522987 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522995 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.522998 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523001 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523003 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523006 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523008 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523011 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:15.523010 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523013 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523017 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523019 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523022 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523025 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523027 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523030 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523032 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523034 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523037 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523039 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523042 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523044 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523046 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523049 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523051 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523054 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523057 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523059 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523061 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:15.523485 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523064 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523067 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523070 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523072 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523074 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523077 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523079 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523081 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523083 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523086 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523088 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523091 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523093 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523096 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523099 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523102 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523104 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523106 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523109 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523111 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:15.523971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523114 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523116 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523119 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523121 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523124 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523126 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523129 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523131 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523134 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523138 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523142 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523145 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523147 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523150 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523152 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523154 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523157 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523159 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523162 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:15.524445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523164 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523167 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523169 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523171 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523174 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523176 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523179 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523541 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523545 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523548 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523551 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523553 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523557 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523560 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523563 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523566 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523569 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523571 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523573 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523576 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:15.524926 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523579 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523581 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523584 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523587 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523589 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523591 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523594 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523597 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523599 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523602 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523604 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523607 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523609 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523612 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523614 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523616 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523619 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523623 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523627 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523631 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:15.525394 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523635 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523637 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523640 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523643 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523645 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523648 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523651 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523653 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523655 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523658 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523660 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523663 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523666 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523668 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523670 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523673 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523675 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523678 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523680 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:15.525895 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523695 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523697 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523700 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523703 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523705 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523707 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523710 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523712 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523715 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523717 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523720 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523722 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523726 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523728 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523731 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523735 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523737 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523740 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523743 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:15.526355 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523745 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523747 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523750 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523752 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523755 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523757 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523761 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523764 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523766 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523769 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523771 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523774 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523776 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523778 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.523781 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523851 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523858 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523865 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523869 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523873 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523877 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523881 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:15:15.526829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523888 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523891 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523894 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523897 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523901 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523904 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523907 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523910 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523913 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523916 2573 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523918 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523921 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523925 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523928 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523931 2573 flags.go:64] FLAG: --config-dir=""
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523934 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523937 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523941 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523945 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523948 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523951 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523954 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523957 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523960 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523963 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:15:15.527442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523966 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523970 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523973 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523976 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523979 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523983 2573 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:15:15.528056
ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523985 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523990 2573 flags.go:64] FLAG: --event-burst="100" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523993 2573 flags.go:64] FLAG: --event-qps="50" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523996 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.523999 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524002 2573 flags.go:64] FLAG: --eviction-hard="" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524006 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524009 2573 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524012 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524014 2573 flags.go:64] FLAG: --eviction-soft="" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524017 2573 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524020 2573 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524023 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524026 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524028 2573 flags.go:64] FLAG: 
--fail-cgroupv1="false" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524031 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524034 2573 flags.go:64] FLAG: --feature-gates="" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524037 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524040 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 14:15:15.528056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524044 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524047 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524050 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524053 2573 flags.go:64] FLAG: --help="false" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524056 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-136-45.ec2.internal" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524059 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524062 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524064 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524068 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524071 2573 
flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524074 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524077 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524080 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524083 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524085 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524088 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524091 2573 flags.go:64] FLAG: --kube-reserved="" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524094 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524097 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524100 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524103 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524105 2573 flags.go:64] FLAG: --lock-file="" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524108 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524111 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 14:15:15.528718 ip-10-0-136-45 kubenswrapper[2573]: 
I0422 14:15:15.524114 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524119 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524121 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524124 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524127 2573 flags.go:64] FLAG: --logging-format="text" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524129 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524133 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524135 2573 flags.go:64] FLAG: --manifest-url="" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524139 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524143 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524147 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524151 2573 flags.go:64] FLAG: --max-pods="110" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524154 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524157 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524160 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 
14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524162 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524165 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524168 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524171 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524178 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524181 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524184 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524187 2573 flags.go:64] FLAG: --pod-cidr="" Apr 22 14:15:15.529281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524190 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524196 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524198 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524201 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524204 2573 flags.go:64] FLAG: --port="10250" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524207 2573 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524210 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0439737782e3f4db9" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524213 2573 flags.go:64] FLAG: --qos-reserved="" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524216 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524219 2573 flags.go:64] FLAG: --register-node="true" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524222 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524225 2573 flags.go:64] FLAG: --register-with-taints="" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524228 2573 flags.go:64] FLAG: --registry-burst="10" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524231 2573 flags.go:64] FLAG: --registry-qps="5" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524243 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524247 2573 flags.go:64] FLAG: --reserved-memory="" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524251 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524255 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524258 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524260 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524264 2573 flags.go:64] FLAG: 
--runonce="false" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524267 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524270 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524273 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524275 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524278 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 14:15:15.529853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524281 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524284 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524287 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524290 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524293 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524296 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524299 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524301 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524304 2573 flags.go:64] FLAG: --system-cgroups="" Apr 22 14:15:15.530495 ip-10-0-136-45 
kubenswrapper[2573]: I0422 14:15:15.524307 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524312 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524315 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524318 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524322 2573 flags.go:64] FLAG: --tls-min-version="" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524324 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524327 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524330 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524333 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524335 2573 flags.go:64] FLAG: --v="2" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524339 2573 flags.go:64] FLAG: --version="false" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524343 2573 flags.go:64] FLAG: --vmodule="" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524347 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524350 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524463 2573 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImages Apr 22 14:15:15.530495 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524468 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524471 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524474 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524483 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524486 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524489 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524491 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524494 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524496 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524499 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524502 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524505 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 
14:15:15.524508 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524510 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524513 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524515 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524517 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524520 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524522 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524525 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:15.531082 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524527 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524530 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524532 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524535 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524537 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:15.531572 ip-10-0-136-45 
kubenswrapper[2573]: W0422 14:15:15.524540 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524542 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524548 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524553 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524556 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524560 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524564 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524567 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524570 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524572 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524575 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524578 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524581 2573 feature_gate.go:328] 
unrecognized feature gate: MachineAPIMigration Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524583 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524586 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:15.531572 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524588 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524591 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524593 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524596 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524598 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524601 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524603 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524605 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524608 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524610 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524613 
2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524615 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524618 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524620 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524622 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524625 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524627 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524630 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524632 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524636 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:15.532063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524638 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524641 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524644 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524651 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524654 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524657 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524659 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524662 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524664 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524667 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524670 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524673 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524675 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524678 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524680 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524694 2573 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524697 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524700 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524703 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:15.532547 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524705 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524708 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524711 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524714 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524716 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.524719 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.524725 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.531826 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.531840 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531886 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531891 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531894 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531897 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531900 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531903 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531905 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:15.533064 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531908 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531911 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531913 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531916 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531919 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531921 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531924 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531926 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531929 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531932 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531934 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531937 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531939 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531942 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531944 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531947 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531949 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531952 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531954 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531956 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:15.533445 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531959 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531961 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531964 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531966 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531970 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531973 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531975 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531978 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531980 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531982 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531985 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531987 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531990 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531992 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531995 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.531997 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532000 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532002 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532004 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:15.533971 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532007 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532009 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532012 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532015 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532017 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532019 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532022 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532024 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532027 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532029 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532031 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532034 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532036 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532039 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532041 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532044 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532046 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532051 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532056 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532059 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:15.534423 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532061 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532064 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532067 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532070 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532073 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532075 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532078 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532081 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532083 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532086 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532089 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532094 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532098 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532100 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532103 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532106 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532109 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532111 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532114 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:15.534924 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532116 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.532122 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532248 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532255 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532258 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532260 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532263 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532266 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532268 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532271 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532274 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532277 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532285 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532288 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532291 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532294 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:15.535362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532296 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532299 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532302 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532304 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532307 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532309 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532312 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532314 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532317 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532319 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532322 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532324 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532327 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532329 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532332 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532335 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532337 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532339 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532342 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532345 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:15.535755 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532347 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532350 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532352 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532354 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532357 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532359 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532362 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532366 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532369 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532377 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532380 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532383 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532386 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532390 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532394 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532396 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532399 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532401 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532403 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:15.536223 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532406 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532408 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532411 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532413 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532416 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532418 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532421 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532423 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532425 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532428 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532430 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532433 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532435 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532438 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532440 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532443 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532445 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532447 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532457 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532460 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:15.536682 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532463 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532465 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532467 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532476 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532478 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532481 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532483 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532485 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532488 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532490 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532493 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532495 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:15.532497 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.532502 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.533467 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 14:15:15.537303 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.537015 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 14:15:15.537944 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.537933 2573 server.go:1019] "Starting client certificate rotation"
Apr 22 14:15:15.538047 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.538028 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:15:15.538091 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.538065 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:15:15.566277 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.566230 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:15:15.570429 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.570382 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:15:15.587269 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.587243 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 22 14:15:15.592867 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.592844 2573 log.go:25] "Validated CRI v1 image API"
Apr 22 14:15:15.594216 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.594201 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 14:15:15.594440 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.594425 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:15.597854 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.597834 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8b34dd73-97d3-42d9-af71-373a8228642f:/dev/nvme0n1p4 8e656ce0-8d01-47c0-8d26-cea1dda05e05:/dev/nvme0n1p3]
Apr 22 14:15:15.597917 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.597852 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 14:15:15.604759 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.604628 2573 manager.go:217] Machine: {Timestamp:2026-04-22 14:15:15.602711848 +0000 UTC m=+0.400474460 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3206703 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2129e3b7f3a3e301b3781f7a0ce8dd SystemUUID:ec2129e3-b7f3-a3e3-01b3-781f7a0ce8dd BootID:57d62fb6-4a37-4aee-a83a-49973366015f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2d:e8:d4:4e:49 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2d:e8:d4:4e:49 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:42:20:b3:af:f0:66 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 14:15:15.604759 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.604752 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 14:15:15.604869 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.604833 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 14:15:15.606000 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.605977 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 14:15:15.606137 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.606002 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-45.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 14:15:15.606184 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.606148 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 14:15:15.606184 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.606157 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 14:15:15.606184 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.606173 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:15.607572 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.607563 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:15.608910 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.608900 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:15.609178 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.609169 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 14:15:15.612118 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.612108 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 22 14:15:15.612164 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.612122 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 14:15:15.612164 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.612140 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 14:15:15.612164 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.612157 2573 kubelet.go:397] "Adding apiserver pod source" Apr 22 14:15:15.612284 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.612166 2573 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 14:15:15.613266 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.613255 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:15.613311 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.613272 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:15.618968 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.618942 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 14:15:15.620621 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.620606 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 14:15:15.622342 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622329 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 14:15:15.622342 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622345 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622351 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622356 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622361 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622367 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622372 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622378 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622385 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622391 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622399 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 14:15:15.622442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.622408 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 14:15:15.623281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.623270 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 14:15:15.623281 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.623281 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 14:15:15.626774 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.626761 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 14:15:15.626853 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.626789 2573 server.go:1295] "Started kubelet" Apr 22 14:15:15.626915 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.626872 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-45.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 14:15:15.626973 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.626898 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 14:15:15.626973 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.626923 2573 
server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 14:15:15.626973 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.626954 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 14:15:15.627639 ip-10-0-136-45 systemd[1]: Started Kubernetes Kubelet. Apr 22 14:15:15.627840 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.627816 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 14:15:15.627840 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.627820 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-45.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 14:15:15.628299 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.628285 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 14:15:15.629550 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.629534 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 22 14:15:15.633111 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.632315 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-45.ec2.internal.18a8b368c8ef85e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-45.ec2.internal,UID:ip-10-0-136-45.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-45.ec2.internal,},FirstTimestamp:2026-04-22 14:15:15.626771943 +0000 UTC m=+0.424534555,LastTimestamp:2026-04-22 14:15:15.626771943 +0000 UTC m=+0.424534555,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-45.ec2.internal,}" Apr 22 14:15:15.633428 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.633410 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 14:15:15.634098 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.634080 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 14:15:15.634888 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.634871 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 14:15:15.634888 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.634888 2573 factory.go:55] Registering systemd factory Apr 22 14:15:15.635028 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.634897 2573 factory.go:223] Registration of the systemd container factory successfully Apr 22 14:15:15.635028 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.634912 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 14:15:15.635028 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.634920 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 14:15:15.635028 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.634945 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" 
Apr 22 14:15:15.635194 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.635100 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 22 14:15:15.635194 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.635126 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 22 14:15:15.635194 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.635161 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:15.635453 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.635437 2573 factory.go:153] Registering CRI-O factory Apr 22 14:15:15.635453 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.635452 2573 factory.go:223] Registration of the crio container factory successfully Apr 22 14:15:15.635574 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.635472 2573 factory.go:103] Registering Raw factory Apr 22 14:15:15.635574 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.635482 2573 manager.go:1196] Started watching for new ooms in manager Apr 22 14:15:15.636588 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.635838 2573 manager.go:319] Starting recovery of all containers Apr 22 14:15:15.639925 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.639893 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2fkqx" Apr 22 14:15:15.645759 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.645509 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2fkqx" Apr 22 14:15:15.646682 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.646653 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-45.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 
22 14:15:15.646891 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.646828 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 14:15:15.648803 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.648789 2573 manager.go:324] Recovery completed Apr 22 14:15:15.652722 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.652710 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:15.654926 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.654911 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:15.654999 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.654936 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:15.654999 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.654945 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:15.655381 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.655368 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 14:15:15.655381 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.655380 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 14:15:15.655450 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.655395 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:15.657055 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.656996 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-45.ec2.internal.18a8b368ca9d193c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-45.ec2.internal,UID:ip-10-0-136-45.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-45.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-45.ec2.internal,},FirstTimestamp:2026-04-22 14:15:15.654924604 +0000 UTC m=+0.452687216,LastTimestamp:2026-04-22 14:15:15.654924604 +0000 UTC m=+0.452687216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-45.ec2.internal,}" Apr 22 14:15:15.657875 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.657863 2573 policy_none.go:49] "None policy: Start" Apr 22 14:15:15.657926 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.657877 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 14:15:15.657926 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.657887 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 22 14:15:15.697255 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.697239 2573 manager.go:341] "Starting Device Plugin manager" Apr 22 14:15:15.712130 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.697272 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 14:15:15.712130 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.697312 2573 server.go:85] "Starting device plugin registration server" Apr 22 14:15:15.712130 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.697578 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 14:15:15.712130 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.697590 2573 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 14:15:15.712130 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.697709 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 14:15:15.712130 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.697786 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 14:15:15.712130 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.697795 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 14:15:15.712130 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.698224 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 14:15:15.712130 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.698258 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:15.771750 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.771676 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 14:15:15.772909 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.772897 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 14:15:15.772986 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.772924 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 14:15:15.772986 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.772954 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 14:15:15.772986 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.772964 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 14:15:15.773114 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.773000 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 14:15:15.775450 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.775434 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:15.798634 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.798619 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:15.799546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.799531 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:15.799636 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.799556 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:15.799636 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.799567 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:15.799636 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.799596 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-45.ec2.internal" Apr 22 14:15:15.807378 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.807363 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-45.ec2.internal" Apr 22 14:15:15.807446 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.807382 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-45.ec2.internal\": node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:15.824415 
ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.824395 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:15.873337 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.873318 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal"] Apr 22 14:15:15.873406 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.873373 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:15.874752 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.874739 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:15.874819 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.874764 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:15.874819 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.874789 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:15.876111 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.876100 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:15.876256 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.876243 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" Apr 22 14:15:15.876297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.876270 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:15.876779 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.876758 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:15.876779 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.876769 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:15.876779 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.876782 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:15.876929 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.876792 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:15.876929 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.876792 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:15.876929 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.876822 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:15.878029 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.878014 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal" Apr 22 14:15:15.878111 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.878039 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:15.878607 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.878593 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:15.878668 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.878624 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:15.878668 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.878634 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:15.898192 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.898175 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-45.ec2.internal\" not found" node="ip-10-0-136-45.ec2.internal" Apr 22 14:15:15.902481 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.902468 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-45.ec2.internal\" not found" node="ip-10-0-136-45.ec2.internal" Apr 22 14:15:15.925499 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:15.925483 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:15.936538 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.936521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1220a96b1f6f6de885f83afbd7a8dd79-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal\" (UID: \"1220a96b1f6f6de885f83afbd7a8dd79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" Apr 22 14:15:15.936604 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.936543 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1220a96b1f6f6de885f83afbd7a8dd79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal\" (UID: \"1220a96b1f6f6de885f83afbd7a8dd79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" Apr 22 14:15:15.936604 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:15.936561 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff0e9bc99f6afd5adc8facc8707d2004-config\") pod \"kube-apiserver-proxy-ip-10-0-136-45.ec2.internal\" (UID: \"ff0e9bc99f6afd5adc8facc8707d2004\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.025927 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:16.025867 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:16.037223 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.037199 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1220a96b1f6f6de885f83afbd7a8dd79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal\" (UID: \"1220a96b1f6f6de885f83afbd7a8dd79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.037223 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.037192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/1220a96b1f6f6de885f83afbd7a8dd79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal\" (UID: \"1220a96b1f6f6de885f83afbd7a8dd79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.037343 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.037250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1220a96b1f6f6de885f83afbd7a8dd79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal\" (UID: \"1220a96b1f6f6de885f83afbd7a8dd79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.037343 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.037273 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff0e9bc99f6afd5adc8facc8707d2004-config\") pod \"kube-apiserver-proxy-ip-10-0-136-45.ec2.internal\" (UID: \"ff0e9bc99f6afd5adc8facc8707d2004\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.037343 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.037275 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1220a96b1f6f6de885f83afbd7a8dd79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal\" (UID: \"1220a96b1f6f6de885f83afbd7a8dd79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.037343 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.037298 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff0e9bc99f6afd5adc8facc8707d2004-config\") pod \"kube-apiserver-proxy-ip-10-0-136-45.ec2.internal\" (UID: \"ff0e9bc99f6afd5adc8facc8707d2004\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.126537 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:16.126517 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:16.200049 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.200024 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.205617 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.205602 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.227469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:16.227446 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:16.328026 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:16.327964 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:16.428428 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:16.428406 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:16.528935 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:16.528913 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:16.538223 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.538204 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 14:15:16.538349 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.538335 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 14:15:16.629493 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:16.629434 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-45.ec2.internal\" not found" Apr 22 14:15:16.633511 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.633495 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 14:15:16.640971 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.640955 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:16.642224 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.642204 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 14:15:16.649170 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.649145 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:10:15 +0000 UTC" deadline="2027-10-28 21:06:31.263364801 +0000 UTC" Apr 22 14:15:16.649170 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.649169 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13302h51m14.614198556s" Apr 22 14:15:16.710057 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.710031 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-px2km" Apr 22 14:15:16.718050 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.718028 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-px2km" Apr 22 14:15:16.734806 
ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.734789 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.743730 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.743706 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:16.745331 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.745316 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" Apr 22 14:15:16.755989 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.755966 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:16.766109 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:16.766080 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0e9bc99f6afd5adc8facc8707d2004.slice/crio-3022c6bbcf33d69a69b3590614c798668ed6988cb3568401baf4a037ad373f4e WatchSource:0}: Error finding container 3022c6bbcf33d69a69b3590614c798668ed6988cb3568401baf4a037ad373f4e: Status 404 returned error can't find the container with id 3022c6bbcf33d69a69b3590614c798668ed6988cb3568401baf4a037ad373f4e Apr 22 14:15:16.766526 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:16.766509 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1220a96b1f6f6de885f83afbd7a8dd79.slice/crio-486b24ff9f42b1fa9e188775e940a62fc18746cb54c22579ca01afd1731d75b1 WatchSource:0}: Error finding container 486b24ff9f42b1fa9e188775e940a62fc18746cb54c22579ca01afd1731d75b1: Status 404 returned error can't find the container with id 
486b24ff9f42b1fa9e188775e940a62fc18746cb54c22579ca01afd1731d75b1 Apr 22 14:15:16.770793 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.770773 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:15:16.776107 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.776053 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" event={"ID":"1220a96b1f6f6de885f83afbd7a8dd79","Type":"ContainerStarted","Data":"486b24ff9f42b1fa9e188775e940a62fc18746cb54c22579ca01afd1731d75b1"} Apr 22 14:15:16.777041 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.777019 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal" event={"ID":"ff0e9bc99f6afd5adc8facc8707d2004","Type":"ContainerStarted","Data":"3022c6bbcf33d69a69b3590614c798668ed6988cb3568401baf4a037ad373f4e"} Apr 22 14:15:16.957008 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:16.956943 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:17.018879 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.018848 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:17.613782 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.613655 2573 apiserver.go:52] "Watching apiserver" Apr 22 14:15:17.620568 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.620538 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 14:15:17.623278 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.623256 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-bdw65","openshift-multus/network-metrics-daemon-qh8tk","openshift-network-operator/iptables-alerter-l2j5r","kube-system/konnectivity-agent-cvntg","kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal","openshift-cluster-node-tuning-operator/tuned-5hdwm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal","openshift-network-diagnostics/network-check-target-mj694","openshift-ovn-kubernetes/ovnkube-node-t88vk","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j","openshift-image-registry/node-ca-vm6ts","openshift-multus/multus-additional-cni-plugins-zbj7k"] Apr 22 14:15:17.626319 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.626299 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l2j5r" Apr 22 14:15:17.627623 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.627320 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cvntg" Apr 22 14:15:17.628611 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.628587 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.629342 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.629126 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 14:15:17.629342 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.629165 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:17.629342 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.629211 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2nd57\"" Apr 22 14:15:17.629530 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.629456 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:17.630432 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.629831 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.631112 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.631090 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.632576 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.632230 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xkp9b\"" Apr 22 14:15:17.632576 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.632305 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:17.632576 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.632230 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:17.632781 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.632674 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.633114 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-76fxw\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.633290 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.633889 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.633925 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fc8c7\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.634163 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.634214 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.634405 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-s27n5\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.634566 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.634569 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 14:15:17.635595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.634830 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 14:15:17.636364 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.636170 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 14:15:17.637440 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.636836 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:17.637440 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:17.636964 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:15:17.640086 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.639878 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:17.640086 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:17.639939 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b" Apr 22 14:15:17.641300 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.641280 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vm6ts" Apr 22 14:15:17.642348 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.641443 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.644589 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b6c47887-6a82-4f97-8cf9-82dee1757b34-serviceca\") pod \"node-ca-vm6ts\" (UID: \"b6c47887-6a82-4f97-8cf9-82dee1757b34\") " pod="openshift-image-registry/node-ca-vm6ts" Apr 22 14:15:17.644702 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-etc-selinux\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.644702 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-sys-fs\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.644702 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-run-multus-certs\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.644861 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-sysconfig\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.644861 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644757 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-sysctl-d\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.644861 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-tuned\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.644861 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/992d1820-8cec-4643-a6fc-96f13a95fd10-host-slash\") pod \"iptables-alerter-l2j5r\" (UID: \"992d1820-8cec-4643-a6fc-96f13a95fd10\") " pod="openshift-network-operator/iptables-alerter-l2j5r" Apr 22 14:15:17.644861 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-multus-cni-dir\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644884 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-run-k8s-cni-cncf-io\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644919 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2fgk\" (UniqueName: \"kubernetes.io/projected/992d1820-8cec-4643-a6fc-96f13a95fd10-kube-api-access-n2fgk\") pod \"iptables-alerter-l2j5r\" (UID: \"992d1820-8cec-4643-a6fc-96f13a95fd10\") " pod="openshift-network-operator/iptables-alerter-l2j5r" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.644977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0f603cf6-03b7-4b6b-aa45-5650e8076be3-konnectivity-ca\") pod \"konnectivity-agent-cvntg\" (UID: \"0f603cf6-03b7-4b6b-aa45-5650e8076be3\") " pod="kube-system/konnectivity-agent-cvntg" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-cnibin\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-os-release\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-multus-conf-dir\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645111 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shmnt\" (UniqueName: \"kubernetes.io/projected/b6c47887-6a82-4f97-8cf9-82dee1757b34-kube-api-access-shmnt\") pod \"node-ca-vm6ts\" (UID: \"b6c47887-6a82-4f97-8cf9-82dee1757b34\") " pod="openshift-image-registry/node-ca-vm6ts" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645121 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 14:15:17.645155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-etc-kubernetes\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.645523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645187 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/992d1820-8cec-4643-a6fc-96f13a95fd10-iptables-alerter-script\") pod \"iptables-alerter-l2j5r\" (UID: 
\"992d1820-8cec-4643-a6fc-96f13a95fd10\") " pod="openshift-network-operator/iptables-alerter-l2j5r" Apr 22 14:15:17.645523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0f603cf6-03b7-4b6b-aa45-5650e8076be3-agent-certs\") pod \"konnectivity-agent-cvntg\" (UID: \"0f603cf6-03b7-4b6b-aa45-5650e8076be3\") " pod="kube-system/konnectivity-agent-cvntg" Apr 22 14:15:17.645523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645279 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-lib-modules\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.645523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clq2p\" (UniqueName: \"kubernetes.io/projected/26e11385-11b5-468d-8f37-f0a6251cf9f8-kube-api-access-clq2p\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645818 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-registration-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-device-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645931 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-modprobe-d\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-systemd\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.645986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-run\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 
14:15:17.646015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-sys\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26e11385-11b5-468d-8f37-f0a6251cf9f8-tmp\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmkjz\" (UniqueName: \"kubernetes.io/projected/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-kube-api-access-wmkjz\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646124 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-system-cni-dir\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646279 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vwwdk\"" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646587 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646652 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646586 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3885c220-9472-43c1-825a-2352438bbb35-cni-binary-copy\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646825 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6c47887-6a82-4f97-8cf9-82dee1757b34-host\") pod \"node-ca-vm6ts\" (UID: \"b6c47887-6a82-4f97-8cf9-82dee1757b34\") " pod="openshift-image-registry/node-ca-vm6ts" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646877 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-run-netns\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-var-lib-cni-bin\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.647502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646963 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-var-lib-kubelet\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.646991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3885c220-9472-43c1-825a-2352438bbb35-multus-daemon-config\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647023 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ctg6\" (UniqueName: \"kubernetes.io/projected/3885c220-9472-43c1-825a-2352438bbb35-kube-api-access-5ctg6\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-kubernetes\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-sysctl-conf\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" 
Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647134 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-socket-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmlm7\" (UniqueName: \"kubernetes.io/projected/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-kube-api-access-nmlm7\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647259 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-multus-socket-dir-parent\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-var-lib-cni-multus\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.648369 ip-10-0-136-45 
kubenswrapper[2573]: I0422 14:15:17.647411 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-hostroot\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647464 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647508 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-var-lib-kubelet\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.647566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-host\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.648066 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.648077 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.648209 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 14:15:17.648369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.648221 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 14:15:17.649702 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.649570 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hrf9n\"" Apr 22 14:15:17.650154 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.649909 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 14:15:17.650261 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.650244 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 14:15:17.650437 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.650418 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 14:15:17.651107 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.651087 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-86fvw\"" Apr 22 14:15:17.671088 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.671065 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:17.718782 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.718737 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" 
expiration="2028-04-21 14:10:16 +0000 UTC" deadline="2028-01-05 19:29:33.058323462 +0000 UTC" Apr 22 14:15:17.718782 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.718767 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14957h14m15.339559167s" Apr 22 14:15:17.735939 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.735912 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 14:15:17.748029 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6c47887-6a82-4f97-8cf9-82dee1757b34-host\") pod \"node-ca-vm6ts\" (UID: \"b6c47887-6a82-4f97-8cf9-82dee1757b34\") " pod="openshift-image-registry/node-ca-vm6ts" Apr 22 14:15:17.748180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-run-netns\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.748180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-log-socket\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.748180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-cni-netd\") pod \"ovnkube-node-t88vk\" (UID: 
\"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.748180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748106 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.748180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da714b4-f642-4144-b84b-7f4d1f4cbe60-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.748180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748127 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6c47887-6a82-4f97-8cf9-82dee1757b34-host\") pod \"node-ca-vm6ts\" (UID: \"b6c47887-6a82-4f97-8cf9-82dee1757b34\") " pod="openshift-image-registry/node-ca-vm6ts" Apr 22 14:15:17.748180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-kubernetes\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.748503 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748186 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-run-netns\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.748503 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-socket-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.748503 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-kubernetes\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.748503 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-var-lib-kubelet\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.748503 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748436 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-host\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.748503 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-socket-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.748503 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748469 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-var-lib-kubelet\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.748503 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-etc-selinux\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-host\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748545 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-run-ovn-kubernetes\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748566 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt6rb\" (UniqueName: \"kubernetes.io/projected/79402404-45d1-4ad9-b411-5640bfc88875-kube-api-access-xt6rb\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748611 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-tuned\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748630 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-etc-selinux\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-multus-cni-dir\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-run-k8s-cni-cncf-io\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748722 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-multus-cni-dir\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csqx8\" (UniqueName: \"kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8\") pod \"network-check-target-mj694\" (UID: \"083554f0-d10f-417b-ac2a-68e07b68b98b\") " pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748809 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-run-k8s-cni-cncf-io\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-run-ovn\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.748927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79402404-45d1-4ad9-b411-5640bfc88875-env-overrides\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.748927 
ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-cnibin\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-os-release\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748972 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-cnibin\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-node-log\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.748952 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79402404-45d1-4ad9-b411-5640bfc88875-ovnkube-script-lib\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749029 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-os-release\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749071 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0f603cf6-03b7-4b6b-aa45-5650e8076be3-agent-certs\") pod \"konnectivity-agent-cvntg\" (UID: \"0f603cf6-03b7-4b6b-aa45-5650e8076be3\") " pod="kube-system/konnectivity-agent-cvntg" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-lib-modules\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-lib-modules\") pod \"tuned-5hdwm\" (UID: 
\"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749253 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-registration-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3885c220-9472-43c1-825a-2352438bbb35-cni-binary-copy\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-multus-socket-dir-parent\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-registration-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749384 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-var-lib-cni-multus\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749423 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-var-lib-cni-multus\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-systemd\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.749469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26e11385-11b5-468d-8f37-f0a6251cf9f8-tmp\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmkjz\" (UniqueName: \"kubernetes.io/projected/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-kube-api-access-wmkjz\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749493 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-multus-socket-dir-parent\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-var-lib-kubelet\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749544 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-systemd-units\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749549 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-systemd\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749572 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-etc-openvswitch\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-cni-bin\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749602 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-var-lib-kubelet\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749620 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-slash\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749794 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3885c220-9472-43c1-825a-2352438bbb35-multus-daemon-config\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ctg6\" (UniqueName: \"kubernetes.io/projected/3885c220-9472-43c1-825a-2352438bbb35-kube-api-access-5ctg6\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749858 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79402404-45d1-4ad9-b411-5640bfc88875-ovnkube-config\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-os-release\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3885c220-9472-43c1-825a-2352438bbb35-cni-binary-copy\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-sysctl-conf\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.749964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmlm7\" (UniqueName: \"kubernetes.io/projected/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-kube-api-access-nmlm7\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.750211 ip-10-0-136-45 kubenswrapper[2573]: I0422 
14:15:17.749992 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-run-netns\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750030 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b6c47887-6a82-4f97-8cf9-82dee1757b34-serviceca\") pod \"node-ca-vm6ts\" (UID: \"b6c47887-6a82-4f97-8cf9-82dee1757b34\") " pod="openshift-image-registry/node-ca-vm6ts"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750095 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-sys-fs\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750127 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-run-multus-certs\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750175 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-run-systemd\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750201 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-run-openvswitch\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-sysconfig\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750273 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-sysctl-d\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-sysctl-conf\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/992d1820-8cec-4643-a6fc-96f13a95fd10-host-slash\") pod \"iptables-alerter-l2j5r\" (UID: \"992d1820-8cec-4643-a6fc-96f13a95fd10\") " pod="openshift-network-operator/iptables-alerter-l2j5r"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-run-multus-certs\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750373 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-sys-fs\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-sysconfig\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.750954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750933 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-sysctl-d\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750980 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.750983 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3885c220-9472-43c1-825a-2352438bbb35-multus-daemon-config\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751021 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/992d1820-8cec-4643-a6fc-96f13a95fd10-host-slash\") pod \"iptables-alerter-l2j5r\" (UID: \"992d1820-8cec-4643-a6fc-96f13a95fd10\") " pod="openshift-network-operator/iptables-alerter-l2j5r"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79402404-45d1-4ad9-b411-5640bfc88875-ovn-node-metrics-cert\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-system-cni-dir\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751106 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7da714b4-f642-4144-b84b-7f4d1f4cbe60-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxr6j\" (UniqueName: \"kubernetes.io/projected/7da714b4-f642-4144-b84b-7f4d1f4cbe60-kube-api-access-rxr6j\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2fgk\" (UniqueName: \"kubernetes.io/projected/992d1820-8cec-4643-a6fc-96f13a95fd10-kube-api-access-n2fgk\") pod \"iptables-alerter-l2j5r\" (UID: \"992d1820-8cec-4643-a6fc-96f13a95fd10\") " pod="openshift-network-operator/iptables-alerter-l2j5r"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751237 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0f603cf6-03b7-4b6b-aa45-5650e8076be3-konnectivity-ca\") pod \"konnectivity-agent-cvntg\" (UID: \"0f603cf6-03b7-4b6b-aa45-5650e8076be3\") " pod="kube-system/konnectivity-agent-cvntg"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shmnt\" (UniqueName: \"kubernetes.io/projected/b6c47887-6a82-4f97-8cf9-82dee1757b34-kube-api-access-shmnt\") pod \"node-ca-vm6ts\" (UID: \"b6c47887-6a82-4f97-8cf9-82dee1757b34\") " pod="openshift-image-registry/node-ca-vm6ts"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-etc-kubernetes\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751319 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-var-lib-openvswitch\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751377 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7da714b4-f642-4144-b84b-7f4d1f4cbe60-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751394 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b6c47887-6a82-4f97-8cf9-82dee1757b34-serviceca\") pod \"node-ca-vm6ts\" (UID: \"b6c47887-6a82-4f97-8cf9-82dee1757b34\") " pod="openshift-image-registry/node-ca-vm6ts"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-multus-conf-dir\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.751546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751437 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-etc-kubernetes\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/992d1820-8cec-4643-a6fc-96f13a95fd10-iptables-alerter-script\") pod \"iptables-alerter-l2j5r\" (UID: \"992d1820-8cec-4643-a6fc-96f13a95fd10\") " pod="openshift-network-operator/iptables-alerter-l2j5r"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751829 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-multus-conf-dir\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clq2p\" (UniqueName: \"kubernetes.io/projected/26e11385-11b5-468d-8f37-f0a6251cf9f8-kube-api-access-clq2p\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-device-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751956 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0f603cf6-03b7-4b6b-aa45-5650e8076be3-konnectivity-ca\") pod \"konnectivity-agent-cvntg\" (UID: \"0f603cf6-03b7-4b6b-aa45-5650e8076be3\") " pod="kube-system/konnectivity-agent-cvntg"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.751978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-hostroot\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-cnibin\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752030 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-modprobe-d\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:17.752057 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-run\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-sys\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:17.752150 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs podName:6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:18.252105915 +0000 UTC m=+3.049868530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs") pod "network-metrics-daemon-qh8tk" (UID: "6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752151 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-sys\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-system-cni-dir\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-var-lib-cni-bin\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.752297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752235 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-kubelet\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.753061 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-tuned\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.753061 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-etc-modprobe-d\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.753061 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752198 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-device-dir\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j"
Apr 22 14:15:17.753061 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-system-cni-dir\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.753061 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-host-var-lib-cni-bin\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.753061 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752367 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26e11385-11b5-468d-8f37-f0a6251cf9f8-run\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.753061 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3885c220-9472-43c1-825a-2352438bbb35-hostroot\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.753061 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.752637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/992d1820-8cec-4643-a6fc-96f13a95fd10-iptables-alerter-script\") pod \"iptables-alerter-l2j5r\" (UID: \"992d1820-8cec-4643-a6fc-96f13a95fd10\") " pod="openshift-network-operator/iptables-alerter-l2j5r"
Apr 22 14:15:17.753366 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.753147 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0f603cf6-03b7-4b6b-aa45-5650e8076be3-agent-certs\") pod \"konnectivity-agent-cvntg\" (UID: \"0f603cf6-03b7-4b6b-aa45-5650e8076be3\") " pod="kube-system/konnectivity-agent-cvntg"
Apr 22 14:15:17.754922 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.754895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26e11385-11b5-468d-8f37-f0a6251cf9f8-tmp\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.758449 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.758396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ctg6\" (UniqueName: \"kubernetes.io/projected/3885c220-9472-43c1-825a-2352438bbb35-kube-api-access-5ctg6\") pod \"multus-bdw65\" (UID: \"3885c220-9472-43c1-825a-2352438bbb35\") " pod="openshift-multus/multus-bdw65"
Apr 22 14:15:17.758637 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.758614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmlm7\" (UniqueName: \"kubernetes.io/projected/4513952a-c2d3-4d8d-9bf2-ff87d4885b58-kube-api-access-nmlm7\") pod \"aws-ebs-csi-driver-node-vzx7j\" (UID: \"4513952a-c2d3-4d8d-9bf2-ff87d4885b58\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j"
Apr 22 14:15:17.759832 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.759810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmkjz\" (UniqueName: \"kubernetes.io/projected/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-kube-api-access-wmkjz\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:17.759966 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.759947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2fgk\" (UniqueName: \"kubernetes.io/projected/992d1820-8cec-4643-a6fc-96f13a95fd10-kube-api-access-n2fgk\") pod \"iptables-alerter-l2j5r\" (UID: \"992d1820-8cec-4643-a6fc-96f13a95fd10\") " pod="openshift-network-operator/iptables-alerter-l2j5r"
Apr 22 14:15:17.760643 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.760620 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shmnt\" (UniqueName: \"kubernetes.io/projected/b6c47887-6a82-4f97-8cf9-82dee1757b34-kube-api-access-shmnt\") pod \"node-ca-vm6ts\" (UID: \"b6c47887-6a82-4f97-8cf9-82dee1757b34\") " pod="openshift-image-registry/node-ca-vm6ts"
Apr 22 14:15:17.760800 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.760776 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clq2p\" (UniqueName: \"kubernetes.io/projected/26e11385-11b5-468d-8f37-f0a6251cf9f8-kube-api-access-clq2p\") pod \"tuned-5hdwm\" (UID: \"26e11385-11b5-468d-8f37-f0a6251cf9f8\") " pod="openshift-cluster-node-tuning-operator/tuned-5hdwm"
Apr 22 14:15:17.852965 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.852923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-run-ovn-kubernetes\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.852965 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.852971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt6rb\" (UniqueName: \"kubernetes.io/projected/79402404-45d1-4ad9-b411-5640bfc88875-kube-api-access-xt6rb\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-run-ovn-kubernetes\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csqx8\" (UniqueName: \"kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8\") pod \"network-check-target-mj694\" (UID: \"083554f0-d10f-417b-ac2a-68e07b68b98b\") " pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:17.853212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-run-ovn\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79402404-45d1-4ad9-b411-5640bfc88875-env-overrides\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-node-log\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-node-log\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-run-ovn\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79402404-45d1-4ad9-b411-5640bfc88875-ovnkube-script-lib\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-systemd-units\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853385 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-etc-openvswitch\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-cni-bin\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853413 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-systemd-units\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-slash\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853443 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-slash\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79402404-45d1-4ad9-b411-5640bfc88875-ovnkube-config\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853477 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-cni-bin\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853475 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-etc-openvswitch\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-os-release\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-run-netns\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853553 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-run-systemd\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-os-release\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853595 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-run-netns\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-run-openvswitch\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-run-openvswitch\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853602 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79402404-45d1-4ad9-b411-5640bfc88875-env-overrides\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-run-systemd\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79402404-45d1-4ad9-b411-5640bfc88875-ovn-node-metrics-cert\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-system-cni-dir\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7da714b4-f642-4144-b84b-7f4d1f4cbe60-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxr6j\" (UniqueName: \"kubernetes.io/projected/7da714b4-f642-4144-b84b-7f4d1f4cbe60-kube-api-access-rxr6j\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k"
Apr 22 14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-var-lib-openvswitch\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22
14:15:17.853916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-system-cni-dir\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853796 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853836 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7da714b4-f642-4144-b84b-7f4d1f4cbe60-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79402404-45d1-4ad9-b411-5640bfc88875-ovnkube-script-lib\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853885 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-cnibin\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: 
\"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853955 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-var-lib-openvswitch\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853966 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79402404-45d1-4ad9-b411-5640bfc88875-ovnkube-config\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.853996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da714b4-f642-4144-b84b-7f4d1f4cbe60-cnibin\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-kubelet\") 
pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-log-socket\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-cni-netd\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854122 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-kubelet\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7da714b4-f642-4144-b84b-7f4d1f4cbe60-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-log-socket\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-cni-netd\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.854768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79402404-45d1-4ad9-b411-5640bfc88875-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.855545 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7da714b4-f642-4144-b84b-7f4d1f4cbe60-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.855545 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854380 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7da714b4-f642-4144-b84b-7f4d1f4cbe60-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.855545 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.854612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da714b4-f642-4144-b84b-7f4d1f4cbe60-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.855833 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.855811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79402404-45d1-4ad9-b411-5640bfc88875-ovn-node-metrics-cert\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.858644 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:17.858624 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:17.858644 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:17.858646 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:17.858837 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:17.858656 2573 projected.go:194] Error preparing data for projected volume kube-api-access-csqx8 for pod openshift-network-diagnostics/network-check-target-mj694: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:17.858837 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:17.858731 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8 podName:083554f0-d10f-417b-ac2a-68e07b68b98b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:18.358717568 +0000 UTC m=+3.156480182 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-csqx8" (UniqueName: "kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8") pod "network-check-target-mj694" (UID: "083554f0-d10f-417b-ac2a-68e07b68b98b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:17.860601 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.860582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt6rb\" (UniqueName: \"kubernetes.io/projected/79402404-45d1-4ad9-b411-5640bfc88875-kube-api-access-xt6rb\") pod \"ovnkube-node-t88vk\" (UID: \"79402404-45d1-4ad9-b411-5640bfc88875\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:17.860997 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.860978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxr6j\" (UniqueName: \"kubernetes.io/projected/7da714b4-f642-4144-b84b-7f4d1f4cbe60-kube-api-access-rxr6j\") pod \"multus-additional-cni-plugins-zbj7k\" (UID: \"7da714b4-f642-4144-b84b-7f4d1f4cbe60\") " pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.946175 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.946091 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-l2j5r" Apr 22 14:15:17.963893 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.963861 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cvntg" Apr 22 14:15:17.974507 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.974486 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" Apr 22 14:15:17.981128 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.981109 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" Apr 22 14:15:17.987758 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.987741 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" Apr 22 14:15:17.994405 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:17.994385 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bdw65" Apr 22 14:15:18.002963 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.002936 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vm6ts" Apr 22 14:15:18.007591 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.007569 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" Apr 22 14:15:18.256668 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.256595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:18.256833 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:18.256778 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:18.256900 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:18.256855 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs podName:6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:19.256836303 +0000 UTC m=+4.054598915 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs") pod "network-metrics-daemon-qh8tk" (UID: "6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:18.351726 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:18.351697 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da714b4_f642_4144_b84b_7f4d1f4cbe60.slice/crio-d9d257c47834f988cb2db36768b2a31c6eb831ae2fdcd72a0893830216a8a048 WatchSource:0}: Error finding container d9d257c47834f988cb2db36768b2a31c6eb831ae2fdcd72a0893830216a8a048: Status 404 returned error can't find the container with id d9d257c47834f988cb2db36768b2a31c6eb831ae2fdcd72a0893830216a8a048 Apr 22 14:15:18.353542 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:18.353492 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79402404_45d1_4ad9_b411_5640bfc88875.slice/crio-a175c812462465850b7b2efe253535e9ba99822f448fcf40bee927b94248a222 WatchSource:0}: Error finding container a175c812462465850b7b2efe253535e9ba99822f448fcf40bee927b94248a222: Status 404 returned error can't find the container with id a175c812462465850b7b2efe253535e9ba99822f448fcf40bee927b94248a222 Apr 22 14:15:18.354196 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:18.354084 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod992d1820_8cec_4643_a6fc_96f13a95fd10.slice/crio-49adbf1200f36d1662eb59a2e6889d978afb7de3b6f1a827151a5cdeba03a530 WatchSource:0}: Error finding container 49adbf1200f36d1662eb59a2e6889d978afb7de3b6f1a827151a5cdeba03a530: Status 404 returned error can't find the container with id 49adbf1200f36d1662eb59a2e6889d978afb7de3b6f1a827151a5cdeba03a530 Apr 22 14:15:18.355763 
ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:18.355742 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f603cf6_03b7_4b6b_aa45_5650e8076be3.slice/crio-dd82ab43cc4eb5e4bcb3a9a78afceb805ca92ab34dcd1e73dab18437af0000ac WatchSource:0}: Error finding container dd82ab43cc4eb5e4bcb3a9a78afceb805ca92ab34dcd1e73dab18437af0000ac: Status 404 returned error can't find the container with id dd82ab43cc4eb5e4bcb3a9a78afceb805ca92ab34dcd1e73dab18437af0000ac Apr 22 14:15:18.356521 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:18.356497 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4513952a_c2d3_4d8d_9bf2_ff87d4885b58.slice/crio-a7fbaf15d681eb6b95484bbdb60d38f34e8ba66f5d09f79a62fecb9c2a5111ce WatchSource:0}: Error finding container a7fbaf15d681eb6b95484bbdb60d38f34e8ba66f5d09f79a62fecb9c2a5111ce: Status 404 returned error can't find the container with id a7fbaf15d681eb6b95484bbdb60d38f34e8ba66f5d09f79a62fecb9c2a5111ce Apr 22 14:15:18.358733 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:18.358636 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6c47887_6a82_4f97_8cf9_82dee1757b34.slice/crio-ca5ff647fe5c96336223d2c11d06a70c58ca0372fb7d79d6a63ce3bc1622893a WatchSource:0}: Error finding container ca5ff647fe5c96336223d2c11d06a70c58ca0372fb7d79d6a63ce3bc1622893a: Status 404 returned error can't find the container with id ca5ff647fe5c96336223d2c11d06a70c58ca0372fb7d79d6a63ce3bc1622893a Apr 22 14:15:18.360442 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:18.360235 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e11385_11b5_468d_8f37_f0a6251cf9f8.slice/crio-49f9c59b3feff4ba4511d6c1c47ae1072cdeda3ed16cb5a33eb0d51f7b34c32f WatchSource:0}: Error 
finding container 49f9c59b3feff4ba4511d6c1c47ae1072cdeda3ed16cb5a33eb0d51f7b34c32f: Status 404 returned error can't find the container with id 49f9c59b3feff4ba4511d6c1c47ae1072cdeda3ed16cb5a33eb0d51f7b34c32f Apr 22 14:15:18.361116 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:18.361019 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3885c220_9472_43c1_825a_2352438bbb35.slice/crio-a64a133c9dd1ea1cb38858f3b6bf2154a8a36641b90953aefa7f29d4685ad555 WatchSource:0}: Error finding container a64a133c9dd1ea1cb38858f3b6bf2154a8a36641b90953aefa7f29d4685ad555: Status 404 returned error can't find the container with id a64a133c9dd1ea1cb38858f3b6bf2154a8a36641b90953aefa7f29d4685ad555 Apr 22 14:15:18.457632 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.457608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csqx8\" (UniqueName: \"kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8\") pod \"network-check-target-mj694\" (UID: \"083554f0-d10f-417b-ac2a-68e07b68b98b\") " pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:18.457794 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:18.457776 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:18.457844 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:18.457797 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:18.457844 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:18.457806 2573 projected.go:194] Error preparing data for projected volume kube-api-access-csqx8 for pod openshift-network-diagnostics/network-check-target-mj694: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:18.457909 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:18.457848 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8 podName:083554f0-d10f-417b-ac2a-68e07b68b98b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:19.457833488 +0000 UTC m=+4.255596091 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-csqx8" (UniqueName: "kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8") pod "network-check-target-mj694" (UID: "083554f0-d10f-417b-ac2a-68e07b68b98b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:18.719426 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.719390 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:16 +0000 UTC" deadline="2028-01-19 00:04:52.42912406 +0000 UTC" Apr 22 14:15:18.719426 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.719421 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15273h49m33.709705539s" Apr 22 14:15:18.773905 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.773199 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:18.773905 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:18.773314 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b" Apr 22 14:15:18.773905 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.773751 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:18.773905 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:18.773855 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:15:18.782219 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.782174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vm6ts" event={"ID":"b6c47887-6a82-4f97-8cf9-82dee1757b34","Type":"ContainerStarted","Data":"ca5ff647fe5c96336223d2c11d06a70c58ca0372fb7d79d6a63ce3bc1622893a"} Apr 22 14:15:18.783750 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.783677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" event={"ID":"4513952a-c2d3-4d8d-9bf2-ff87d4885b58","Type":"ContainerStarted","Data":"a7fbaf15d681eb6b95484bbdb60d38f34e8ba66f5d09f79a62fecb9c2a5111ce"} Apr 22 14:15:18.785062 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.785020 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cvntg" event={"ID":"0f603cf6-03b7-4b6b-aa45-5650e8076be3","Type":"ContainerStarted","Data":"dd82ab43cc4eb5e4bcb3a9a78afceb805ca92ab34dcd1e73dab18437af0000ac"} Apr 22 14:15:18.788272 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.788218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerStarted","Data":"a175c812462465850b7b2efe253535e9ba99822f448fcf40bee927b94248a222"} Apr 22 14:15:18.794626 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.794136 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal" event={"ID":"ff0e9bc99f6afd5adc8facc8707d2004","Type":"ContainerStarted","Data":"a030b6059dd3766ad3795f30e6c9c79472b5c474dcd805535191d31e38f2207d"} Apr 22 14:15:18.798594 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.798571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" 
event={"ID":"26e11385-11b5-468d-8f37-f0a6251cf9f8","Type":"ContainerStarted","Data":"49f9c59b3feff4ba4511d6c1c47ae1072cdeda3ed16cb5a33eb0d51f7b34c32f"} Apr 22 14:15:18.802706 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.802251 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l2j5r" event={"ID":"992d1820-8cec-4643-a6fc-96f13a95fd10","Type":"ContainerStarted","Data":"49adbf1200f36d1662eb59a2e6889d978afb7de3b6f1a827151a5cdeba03a530"} Apr 22 14:15:18.804974 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.804696 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" event={"ID":"7da714b4-f642-4144-b84b-7f4d1f4cbe60","Type":"ContainerStarted","Data":"d9d257c47834f988cb2db36768b2a31c6eb831ae2fdcd72a0893830216a8a048"} Apr 22 14:15:18.811032 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.810739 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-45.ec2.internal" podStartSLOduration=2.810724288 podStartE2EDuration="2.810724288s" podCreationTimestamp="2026-04-22 14:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:18.810653642 +0000 UTC m=+3.608416265" watchObservedRunningTime="2026-04-22 14:15:18.810724288 +0000 UTC m=+3.608486910" Apr 22 14:15:18.812978 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:18.812741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdw65" event={"ID":"3885c220-9472-43c1-825a-2352438bbb35","Type":"ContainerStarted","Data":"a64a133c9dd1ea1cb38858f3b6bf2154a8a36641b90953aefa7f29d4685ad555"} Apr 22 14:15:19.264265 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:19.264230 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:19.264448 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:19.264411 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:19.264518 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:19.264479 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs podName:6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:21.264459884 +0000 UTC m=+6.062222501 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs") pod "network-metrics-daemon-qh8tk" (UID: "6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:19.465547 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:19.465465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csqx8\" (UniqueName: \"kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8\") pod \"network-check-target-mj694\" (UID: \"083554f0-d10f-417b-ac2a-68e07b68b98b\") " pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:19.465733 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:19.465624 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:19.465733 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:19.465642 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:19.465733 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:19.465655 2573 projected.go:194] Error preparing data for projected volume kube-api-access-csqx8 for pod openshift-network-diagnostics/network-check-target-mj694: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:19.465733 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:19.465727 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8 podName:083554f0-d10f-417b-ac2a-68e07b68b98b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:21.465708644 +0000 UTC m=+6.263471259 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-csqx8" (UniqueName: "kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8") pod "network-check-target-mj694" (UID: "083554f0-d10f-417b-ac2a-68e07b68b98b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:19.831046 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:19.831008 2573 generic.go:358] "Generic (PLEG): container finished" podID="1220a96b1f6f6de885f83afbd7a8dd79" containerID="a94cf2da0ce5e53ae5b79fec5261e2594f96c2f03c8b8a3f303dc80be94e3c16" exitCode=0 Apr 22 14:15:19.831581 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:19.831558 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" event={"ID":"1220a96b1f6f6de885f83afbd7a8dd79","Type":"ContainerDied","Data":"a94cf2da0ce5e53ae5b79fec5261e2594f96c2f03c8b8a3f303dc80be94e3c16"} Apr 22 14:15:20.773391 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:20.773355 
2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:20.773583 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:20.773367 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:20.773583 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:20.773496 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:15:20.773583 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:20.773569 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b" Apr 22 14:15:20.838526 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:20.837920 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" event={"ID":"1220a96b1f6f6de885f83afbd7a8dd79","Type":"ContainerStarted","Data":"c09967550416fc3a23c583d0a738b1cb5ebfbe4564667687d9555dfba1470401"} Apr 22 14:15:20.853047 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:20.852848 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-45.ec2.internal" podStartSLOduration=4.852830298 podStartE2EDuration="4.852830298s" podCreationTimestamp="2026-04-22 14:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:20.852803562 +0000 UTC m=+5.650566183" watchObservedRunningTime="2026-04-22 14:15:20.852830298 +0000 UTC m=+5.650592919" Apr 22 14:15:21.280597 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.280542 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:21.280791 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:21.280658 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:21.280791 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:21.280746 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs 
podName:6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:25.280724709 +0000 UTC m=+10.078487311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs") pod "network-metrics-daemon-qh8tk" (UID: "6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:21.430241 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.429430 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-b256j"] Apr 22 14:15:21.433836 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.433812 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.439851 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.439802 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rxc75\"" Apr 22 14:15:21.440102 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.440066 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 14:15:21.440588 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.440567 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 14:15:21.481999 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.481966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csqx8\" (UniqueName: \"kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8\") pod \"network-check-target-mj694\" (UID: \"083554f0-d10f-417b-ac2a-68e07b68b98b\") " pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:21.482154 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.482014 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/04c378f7-681d-435b-8a89-47348177d100-hosts-file\") pod \"node-resolver-b256j\" (UID: \"04c378f7-681d-435b-8a89-47348177d100\") " pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.482154 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.482032 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvjts\" (UniqueName: \"kubernetes.io/projected/04c378f7-681d-435b-8a89-47348177d100-kube-api-access-qvjts\") pod \"node-resolver-b256j\" (UID: \"04c378f7-681d-435b-8a89-47348177d100\") " pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.482154 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.482100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/04c378f7-681d-435b-8a89-47348177d100-tmp-dir\") pod \"node-resolver-b256j\" (UID: \"04c378f7-681d-435b-8a89-47348177d100\") " pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.482309 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:21.482230 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:21.482309 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:21.482246 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:21.482309 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:21.482260 2573 projected.go:194] Error preparing data for projected volume kube-api-access-csqx8 for pod openshift-network-diagnostics/network-check-target-mj694: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:21.482309 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:21.482309 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8 podName:083554f0-d10f-417b-ac2a-68e07b68b98b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:25.482293036 +0000 UTC m=+10.280055640 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-csqx8" (UniqueName: "kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8") pod "network-check-target-mj694" (UID: "083554f0-d10f-417b-ac2a-68e07b68b98b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:21.582832 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.582752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/04c378f7-681d-435b-8a89-47348177d100-tmp-dir\") pod \"node-resolver-b256j\" (UID: \"04c378f7-681d-435b-8a89-47348177d100\") " pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.582832 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.582818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/04c378f7-681d-435b-8a89-47348177d100-hosts-file\") pod \"node-resolver-b256j\" (UID: \"04c378f7-681d-435b-8a89-47348177d100\") " pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.583037 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.582842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvjts\" (UniqueName: \"kubernetes.io/projected/04c378f7-681d-435b-8a89-47348177d100-kube-api-access-qvjts\") pod \"node-resolver-b256j\" (UID: 
\"04c378f7-681d-435b-8a89-47348177d100\") " pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.583037 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.582951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/04c378f7-681d-435b-8a89-47348177d100-hosts-file\") pod \"node-resolver-b256j\" (UID: \"04c378f7-681d-435b-8a89-47348177d100\") " pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.583136 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.583085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/04c378f7-681d-435b-8a89-47348177d100-tmp-dir\") pod \"node-resolver-b256j\" (UID: \"04c378f7-681d-435b-8a89-47348177d100\") " pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.593946 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.593903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvjts\" (UniqueName: \"kubernetes.io/projected/04c378f7-681d-435b-8a89-47348177d100-kube-api-access-qvjts\") pod \"node-resolver-b256j\" (UID: \"04c378f7-681d-435b-8a89-47348177d100\") " pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:21.745363 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:21.745322 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b256j" Apr 22 14:15:22.773580 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:22.773539 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:22.774033 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:22.773681 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:15:22.774033 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:22.773931 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:22.774139 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:22.774040 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b" Apr 22 14:15:24.774058 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:24.774022 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:24.774502 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:24.774148 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b" Apr 22 14:15:24.774502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:24.774183 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:24.774502 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:24.774310 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:15:25.311700 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:25.311592 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:25.311895 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:25.311734 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:25.311895 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:25.311804 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs podName:6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.311784335 +0000 UTC m=+18.109546943 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs") pod "network-metrics-daemon-qh8tk" (UID: "6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:25.513368 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:25.512673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csqx8\" (UniqueName: \"kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8\") pod \"network-check-target-mj694\" (UID: \"083554f0-d10f-417b-ac2a-68e07b68b98b\") " pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:25.513368 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:25.512895 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:25.513368 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:25.512917 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:25.513368 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:25.512930 2573 projected.go:194] Error preparing data for projected volume kube-api-access-csqx8 for pod openshift-network-diagnostics/network-check-target-mj694: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:25.513368 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:25.512993 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8 podName:083554f0-d10f-417b-ac2a-68e07b68b98b nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:33.512974824 +0000 UTC m=+18.310737447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-csqx8" (UniqueName: "kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8") pod "network-check-target-mj694" (UID: "083554f0-d10f-417b-ac2a-68e07b68b98b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:26.177630 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.177596 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2wkqh"] Apr 22 14:15:26.179545 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.179517 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.179670 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:26.179602 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6" Apr 22 14:15:26.218295 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.218257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d4bc0a54-af51-4f91-af28-fff86847e8d6-kubelet-config\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.218447 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.218314 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.218507 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.218435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d4bc0a54-af51-4f91-af28-fff86847e8d6-dbus\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.319866 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.319822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d4bc0a54-af51-4f91-af28-fff86847e8d6-kubelet-config\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.320025 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.319884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.320025 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.319934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d4bc0a54-af51-4f91-af28-fff86847e8d6-dbus\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.320025 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.319964 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d4bc0a54-af51-4f91-af28-fff86847e8d6-kubelet-config\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.320206 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:26.320079 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:26.320206 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:26.320148 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret podName:d4bc0a54-af51-4f91-af28-fff86847e8d6 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:26.820129472 +0000 UTC m=+11.617892086 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret") pod "global-pull-secret-syncer-2wkqh" (UID: "d4bc0a54-af51-4f91-af28-fff86847e8d6") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:26.320206 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.320081 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d4bc0a54-af51-4f91-af28-fff86847e8d6-dbus\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.774058 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.774028 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:26.774225 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.774028 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:26.774278 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:26.774225 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:15:26.774345 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:26.774326 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b" Apr 22 14:15:26.823948 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:26.823858 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:26.824119 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:26.823998 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:26.824119 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:26.824068 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret podName:d4bc0a54-af51-4f91-af28-fff86847e8d6 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:27.824049019 +0000 UTC m=+12.621811636 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret") pod "global-pull-secret-syncer-2wkqh" (UID: "d4bc0a54-af51-4f91-af28-fff86847e8d6") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:27.774108 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:27.774078 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:27.774549 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:27.774203 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6" Apr 22 14:15:27.832348 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:27.832307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:27.832539 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:27.832466 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:27.832604 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:27.832548 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret podName:d4bc0a54-af51-4f91-af28-fff86847e8d6 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:29.832527608 +0000 UTC m=+14.630290223 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret") pod "global-pull-secret-syncer-2wkqh" (UID: "d4bc0a54-af51-4f91-af28-fff86847e8d6") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:28.773661 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:28.773626 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:28.773833 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:28.773627 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:28.773833 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:28.773800 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:15:28.773935 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:28.773889 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b" Apr 22 14:15:29.773841 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:29.773808 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:29.774316 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:29.773935 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6" Apr 22 14:15:29.848509 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:29.848469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:29.848704 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:29.848636 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:29.848764 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:29.848729 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret podName:d4bc0a54-af51-4f91-af28-fff86847e8d6 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.848711597 +0000 UTC m=+18.646474210 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret") pod "global-pull-secret-syncer-2wkqh" (UID: "d4bc0a54-af51-4f91-af28-fff86847e8d6") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:30.773860 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:30.773823 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:30.774326 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:30.773947 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b"
Apr 22 14:15:30.774326 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:30.774015 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:30.774326 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:30.774120 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d"
Apr 22 14:15:31.773567 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:31.773522 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:31.773749 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:31.773654 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6"
Apr 22 14:15:32.773637 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:32.773605 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:32.774083 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:32.773614 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:32.774083 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:32.773712 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b"
Apr 22 14:15:32.774083 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:32.773780 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d"
Apr 22 14:15:33.375101 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:33.375062 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:33.375346 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:33.375200 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:33.375346 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:33.375264 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs podName:6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:49.375245441 +0000 UTC m=+34.173008044 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs") pod "network-metrics-daemon-qh8tk" (UID: "6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:33.576779 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:33.576738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csqx8\" (UniqueName: \"kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8\") pod \"network-check-target-mj694\" (UID: \"083554f0-d10f-417b-ac2a-68e07b68b98b\") " pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:33.576970 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:33.576934 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:33.576970 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:33.576952 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:33.576970 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:33.576964 2573 projected.go:194] Error preparing data for projected volume kube-api-access-csqx8 for pod openshift-network-diagnostics/network-check-target-mj694: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:33.577120 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:33.577031 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8 podName:083554f0-d10f-417b-ac2a-68e07b68b98b nodeName:}" failed. No retries permitted until 2026-04-22 14:15:49.577011429 +0000 UTC m=+34.374774035 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-csqx8" (UniqueName: "kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8") pod "network-check-target-mj694" (UID: "083554f0-d10f-417b-ac2a-68e07b68b98b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:33.776646 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:33.776612 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:33.777077 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:33.776755 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6"
Apr 22 14:15:33.878511 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:33.878473 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:33.878680 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:33.878624 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:33.878758 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:33.878701 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret podName:d4bc0a54-af51-4f91-af28-fff86847e8d6 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:41.878672496 +0000 UTC m=+26.676435100 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret") pod "global-pull-secret-syncer-2wkqh" (UID: "d4bc0a54-af51-4f91-af28-fff86847e8d6") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:34.593626 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:34.593588 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c378f7_681d_435b_8a89_47348177d100.slice/crio-4e0d4f691d9f08b1a58b362b327b19a0e8c3e9f3dce9efcba2701fe92ec619cf WatchSource:0}: Error finding container 4e0d4f691d9f08b1a58b362b327b19a0e8c3e9f3dce9efcba2701fe92ec619cf: Status 404 returned error can't find the container with id 4e0d4f691d9f08b1a58b362b327b19a0e8c3e9f3dce9efcba2701fe92ec619cf
Apr 22 14:15:34.777256 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.773525 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:34.777256 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:34.773648 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b"
Apr 22 14:15:34.777256 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.774101 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:34.777256 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:34.774226 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d"
Apr 22 14:15:34.865406 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.865346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b256j" event={"ID":"04c378f7-681d-435b-8a89-47348177d100","Type":"ContainerStarted","Data":"4e0d4f691d9f08b1a58b362b327b19a0e8c3e9f3dce9efcba2701fe92ec619cf"}
Apr 22 14:15:34.868414 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.868035 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdw65" event={"ID":"3885c220-9472-43c1-825a-2352438bbb35","Type":"ContainerStarted","Data":"ae373ea0695cd662b7515aa20c9784ed4393cc9e0594a65d0eec6d58cb35ae22"}
Apr 22 14:15:34.877387 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.877356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" event={"ID":"4513952a-c2d3-4d8d-9bf2-ff87d4885b58","Type":"ContainerStarted","Data":"ab921f6d2f4d68515b519da46665adf11e7013e6781c1631cebc3701c7db9b80"}
Apr 22 14:15:34.881420 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.879735 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cvntg" event={"ID":"0f603cf6-03b7-4b6b-aa45-5650e8076be3","Type":"ContainerStarted","Data":"6f4e0dd1a9ff596d67bde5570ea819e088eded8d3dfdd9d8314dc1f522752543"}
Apr 22 14:15:34.883565 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.883538 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerStarted","Data":"1dc46d28df129ccdb9d6ff1d68011aaf74b6a88545807ca49dd7f544bdb88357"}
Apr 22 14:15:34.885616 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.885593 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" event={"ID":"26e11385-11b5-468d-8f37-f0a6251cf9f8","Type":"ContainerStarted","Data":"c3981e65c5b2cacdff4bc5cf70c8cc1a9b295521da6cb79fe613131eb9a35bb0"}
Apr 22 14:15:34.888280 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.888234 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bdw65" podStartSLOduration=2.600013417 podStartE2EDuration="18.888218651s" podCreationTimestamp="2026-04-22 14:15:16 +0000 UTC" firstStartedPulling="2026-04-22 14:15:18.362960567 +0000 UTC m=+3.160723167" lastFinishedPulling="2026-04-22 14:15:34.651165798 +0000 UTC m=+19.448928401" observedRunningTime="2026-04-22 14:15:34.88794817 +0000 UTC m=+19.685710792" watchObservedRunningTime="2026-04-22 14:15:34.888218651 +0000 UTC m=+19.685981273"
Apr 22 14:15:34.899507 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.899465 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cvntg" podStartSLOduration=3.6281567470000002 podStartE2EDuration="19.899439459s" podCreationTimestamp="2026-04-22 14:15:15 +0000 UTC" firstStartedPulling="2026-04-22 14:15:18.357552697 +0000 UTC m=+3.155315304" lastFinishedPulling="2026-04-22 14:15:34.628835401 +0000 UTC m=+19.426598016" observedRunningTime="2026-04-22 14:15:34.899355335 +0000 UTC m=+19.697117957" watchObservedRunningTime="2026-04-22 14:15:34.899439459 +0000 UTC m=+19.697202079"
Apr 22 14:15:34.918441 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:34.918389 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5hdwm" podStartSLOduration=3.651692168 podStartE2EDuration="19.918371005s" podCreationTimestamp="2026-04-22 14:15:15 +0000 UTC" firstStartedPulling="2026-04-22 14:15:18.362119138 +0000 UTC m=+3.159881754" lastFinishedPulling="2026-04-22 14:15:34.628797984 +0000 UTC m=+19.426560591" observedRunningTime="2026-04-22 14:15:34.91785683 +0000 UTC m=+19.715619452" watchObservedRunningTime="2026-04-22 14:15:34.918371005 +0000 UTC m=+19.716133626"
Apr 22 14:15:35.714595 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.714408 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 14:15:35.774580 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.774545 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:35.774778 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:35.774626 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6"
Apr 22 14:15:35.888672 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.888582 2573 generic.go:358] "Generic (PLEG): container finished" podID="7da714b4-f642-4144-b84b-7f4d1f4cbe60" containerID="a9805b5bfd3b20d2a33fbe8f36453e34a66fd9a7c78762d97d52ed0413957778" exitCode=0
Apr 22 14:15:35.888672 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.888653 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" event={"ID":"7da714b4-f642-4144-b84b-7f4d1f4cbe60","Type":"ContainerDied","Data":"a9805b5bfd3b20d2a33fbe8f36453e34a66fd9a7c78762d97d52ed0413957778"}
Apr 22 14:15:35.889958 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.889931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b256j" event={"ID":"04c378f7-681d-435b-8a89-47348177d100","Type":"ContainerStarted","Data":"52d1f4ba2b4d40e2e513b8a7aac8716eb1c910e263db21f0800a7bfd41c9922f"}
Apr 22 14:15:35.891160 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.891135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vm6ts" event={"ID":"b6c47887-6a82-4f97-8cf9-82dee1757b34","Type":"ContainerStarted","Data":"94e0b6e67fed9a6ca58b6308517f3caba4aad4e04d875590a50ea311f6a355ce"}
Apr 22 14:15:35.892739 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.892711 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" event={"ID":"4513952a-c2d3-4d8d-9bf2-ff87d4885b58","Type":"ContainerStarted","Data":"ed99789e7c38f8d3bbfa359598819479d36c6dac5d6399851705bcc64a3f578a"}
Apr 22 14:15:35.894865 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.894848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 14:15:35.895119 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.895100 2573 generic.go:358] "Generic (PLEG): container finished" podID="79402404-45d1-4ad9-b411-5640bfc88875" containerID="eb64b784f9b31db2b6bbd2a7d4c58f64b6294abf2efa6ab58c0f4b1b1c4c3aca" exitCode=1
Apr 22 14:15:35.895176 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.895135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerStarted","Data":"1eecd7a431b50cecc16af585aba055a40345cbca1cce006bcb29eefcadc5edb5"}
Apr 22 14:15:35.895176 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.895160 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerStarted","Data":"9854c8001a969495853c3cd2d1db7fd3b46fe3afa04a362bb1ea10cdcdf03e9a"}
Apr 22 14:15:35.895176 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.895172 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerStarted","Data":"c219f87741a77edc7ead66a372e76295c6bd2632bdc088ae3e1bb5ae353d22bc"}
Apr 22 14:15:35.895289 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.895184 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerStarted","Data":"e8f557649465116c2fdc6c0e40cc54ae789ef9378bac4af3913e7c0ab944e093"}
Apr 22 14:15:35.895289 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.895211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerDied","Data":"eb64b784f9b31db2b6bbd2a7d4c58f64b6294abf2efa6ab58c0f4b1b1c4c3aca"}
Apr 22 14:15:35.916771 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.916727 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vm6ts" podStartSLOduration=4.647649543 podStartE2EDuration="20.916714641s" podCreationTimestamp="2026-04-22 14:15:15 +0000 UTC" firstStartedPulling="2026-04-22 14:15:18.360284605 +0000 UTC m=+3.158047215" lastFinishedPulling="2026-04-22 14:15:34.629349713 +0000 UTC m=+19.427112313" observedRunningTime="2026-04-22 14:15:35.916370636 +0000 UTC m=+20.714133259" watchObservedRunningTime="2026-04-22 14:15:35.916714641 +0000 UTC m=+20.714477262"
Apr 22 14:15:35.929301 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:35.929266 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b256j" podStartSLOduration=14.929253969 podStartE2EDuration="14.929253969s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:35.92893459 +0000 UTC m=+20.726697210" watchObservedRunningTime="2026-04-22 14:15:35.929253969 +0000 UTC m=+20.727016590"
Apr 22 14:15:36.148303 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.148218 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cvntg"
Apr 22 14:15:36.148996 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.148972 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cvntg"
Apr 22 14:15:36.706620 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.706326 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:15:35.714592617Z","UUID":"0194b262-f486-4e42-9632-b722d211fca1","Handler":null,"Name":"","Endpoint":""}
Apr 22 14:15:36.709079 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.709053 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 14:15:36.709079 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.709087 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 14:15:36.773856 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.773823 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:36.774025 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.773823 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:36.774025 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:36.773955 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d"
Apr 22 14:15:36.774025 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:36.773988 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b"
Apr 22 14:15:36.898778 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.898733 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l2j5r" event={"ID":"992d1820-8cec-4643-a6fc-96f13a95fd10","Type":"ContainerStarted","Data":"5c977df4cf8c0746a599b6b680115bb3d4c558dae7bf459563ade6c6d0b66d29"}
Apr 22 14:15:36.900824 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.900795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" event={"ID":"4513952a-c2d3-4d8d-9bf2-ff87d4885b58","Type":"ContainerStarted","Data":"4a136602e321a3e597b0a870433497ec2e76adaafade0978919ac56f10e5b127"}
Apr 22 14:15:36.911143 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.911090 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-l2j5r" podStartSLOduration=5.639058438 podStartE2EDuration="21.911072658s" podCreationTimestamp="2026-04-22 14:15:15 +0000 UTC" firstStartedPulling="2026-04-22 14:15:18.35678272 +0000 UTC m=+3.154545325" lastFinishedPulling="2026-04-22 14:15:34.628796929 +0000 UTC m=+19.426559545" observedRunningTime="2026-04-22 14:15:36.910444494 +0000 UTC m=+21.708207115" watchObservedRunningTime="2026-04-22 14:15:36.911072658 +0000 UTC m=+21.708835281"
Apr 22 14:15:36.923749 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:36.923678 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzx7j" podStartSLOduration=3.813706635 podStartE2EDuration="21.923660992s" podCreationTimestamp="2026-04-22 14:15:15 +0000 UTC" firstStartedPulling="2026-04-22 14:15:18.358922858 +0000 UTC m=+3.156685471" lastFinishedPulling="2026-04-22 14:15:36.468877221 +0000 UTC m=+21.266639828" observedRunningTime="2026-04-22 14:15:36.92360876 +0000 UTC m=+21.721371389" watchObservedRunningTime="2026-04-22 14:15:36.923660992 +0000 UTC m=+21.721423599"
Apr 22 14:15:37.773437 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:37.773398 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:37.773628 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:37.773532 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6"
Apr 22 14:15:37.905323 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:37.905299 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 14:15:37.905842 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:37.905712 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerStarted","Data":"35d387062d89f1a3262751933ae052bd4afeec505b9315132f39625b54bd31bf"}
Apr 22 14:15:37.905842 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:37.905766 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 14:15:38.773481 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:38.773449 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:38.773645 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:38.773449 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:38.773645 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:38.773558 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b"
Apr 22 14:15:38.773783 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:38.773643 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d"
Apr 22 14:15:39.774229 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:39.774208 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:39.774727 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:39.774302 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6"
Apr 22 14:15:39.912860 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:39.912673 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 14:15:39.913240 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:39.913197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerStarted","Data":"8ca0ed56297cf11220d847a9e2117d826679897530b31543160b5ae293c51c19"}
Apr 22 14:15:40.773130 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:40.773099 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:40.773287 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:40.773105 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:40.773287 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:40.773199 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b"
Apr 22 14:15:40.773458 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:40.773298 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d"
Apr 22 14:15:40.916348 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:40.916315 2573 generic.go:358] "Generic (PLEG): container finished" podID="7da714b4-f642-4144-b84b-7f4d1f4cbe60" containerID="76998f53588a29694b0bb6d85ce0aead368269bd1775e3df38be80a24b2b5973" exitCode=0
Apr 22 14:15:40.916814 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:40.916395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" event={"ID":"7da714b4-f642-4144-b84b-7f4d1f4cbe60","Type":"ContainerDied","Data":"76998f53588a29694b0bb6d85ce0aead368269bd1775e3df38be80a24b2b5973"}
Apr 22 14:15:40.916814 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:40.916775 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:40.917545 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:40.916915 2573 scope.go:117] "RemoveContainer" containerID="eb64b784f9b31db2b6bbd2a7d4c58f64b6294abf2efa6ab58c0f4b1b1c4c3aca"
Apr 22 14:15:40.931205 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:40.931185 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:41.728423 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.728383 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:41.773948 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.773911 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:41.774096 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:41.774034 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6"
Apr 22 14:15:41.809106 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.809080 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2wkqh"]
Apr 22 14:15:41.811962 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.811934 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mj694"]
Apr 22 14:15:41.812084 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.812052 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:41.812153 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:41.812134 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b"
Apr 22 14:15:41.812802 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.812773 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qh8tk"]
Apr 22 14:15:41.812919 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.812875 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:41.812974 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:41.812957 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d"
Apr 22 14:15:41.920942 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.920912 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 14:15:41.921374 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.921244 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:41.921374 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.921250 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" event={"ID":"79402404-45d1-4ad9-b411-5640bfc88875","Type":"ContainerStarted","Data":"be3b840e4a47e84a911c335f520dfe7aca8614eae96e8b9db9b49b604e36885e"}
Apr 22 14:15:41.921374 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:41.921358 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6"
Apr 22 14:15:41.921517 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.921484 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:41.935207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.935183 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:15:41.943438 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.943419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:41.943598 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:41.943579 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:41.943671 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:41.943662 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret podName:d4bc0a54-af51-4f91-af28-fff86847e8d6 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:57.943643489 +0000 UTC m=+42.741406100 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret") pod "global-pull-secret-syncer-2wkqh" (UID: "d4bc0a54-af51-4f91-af28-fff86847e8d6") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:41.947461 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:41.947426 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk" podStartSLOduration=9.618941701 podStartE2EDuration="25.94741628s" podCreationTimestamp="2026-04-22 14:15:16 +0000 UTC" firstStartedPulling="2026-04-22 14:15:18.355577267 +0000 UTC m=+3.153339868" lastFinishedPulling="2026-04-22 14:15:34.684051833 +0000 UTC m=+19.481814447" observedRunningTime="2026-04-22 14:15:41.947227612 +0000 UTC m=+26.744990233" watchObservedRunningTime="2026-04-22 14:15:41.94741628 +0000 UTC m=+26.745178951"
Apr 22 14:15:42.925248 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:42.924997 2573 generic.go:358] "Generic (PLEG): container finished" podID="7da714b4-f642-4144-b84b-7f4d1f4cbe60" containerID="534cce9cd0c4f51695c39d4918ab8aa109de18712536b2f9f8dfad49d8873e60" exitCode=0
Apr 22 14:15:42.925596 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:42.925081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" event={"ID":"7da714b4-f642-4144-b84b-7f4d1f4cbe60","Type":"ContainerDied","Data":"534cce9cd0c4f51695c39d4918ab8aa109de18712536b2f9f8dfad49d8873e60"}
Apr 22 14:15:43.773615 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:43.773586 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:43.773615 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:43.773603 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:43.773834 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:43.773589 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:43.773834 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:43.773709 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:15:43.773834 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:43.773767 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6" Apr 22 14:15:43.773834 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:43.773816 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b" Apr 22 14:15:44.931491 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:44.931458 2573 generic.go:358] "Generic (PLEG): container finished" podID="7da714b4-f642-4144-b84b-7f4d1f4cbe60" containerID="e8058bdc0e68651df7cb7f347d0e9b7b0b3ffecee6a907dd9b3303ef0b97ab68" exitCode=0 Apr 22 14:15:44.932113 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:44.931527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" event={"ID":"7da714b4-f642-4144-b84b-7f4d1f4cbe60","Type":"ContainerDied","Data":"e8058bdc0e68651df7cb7f347d0e9b7b0b3ffecee6a907dd9b3303ef0b97ab68"} Apr 22 14:15:45.774681 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:45.774643 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:45.774873 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:45.774758 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:45.774873 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:45.774796 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:45.774873 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:45.774796 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2wkqh" podUID="d4bc0a54-af51-4f91-af28-fff86847e8d6" Apr 22 14:15:45.775023 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:45.774884 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:15:45.775023 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:45.774954 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mj694" podUID="083554f0-d10f-417b-ac2a-68e07b68b98b" Apr 22 14:15:46.865960 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:46.865927 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cvntg" Apr 22 14:15:46.866489 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:46.866071 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 14:15:46.866553 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:46.866538 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cvntg" Apr 22 14:15:47.571191 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.571101 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-45.ec2.internal" event="NodeReady" Apr 22 14:15:47.571358 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.571257 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 
14:15:47.616909 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.616879 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"] Apr 22 14:15:47.638330 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.638295 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-b7rqq"] Apr 22 14:15:47.638680 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.638646 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.641113 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.641095 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 14:15:47.641229 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.641114 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 14:15:47.641229 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.641201 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 14:15:47.641396 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.641331 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-x72w4\"" Apr 22 14:15:47.646039 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.646017 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 14:15:47.653430 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.653333 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"] Apr 22 14:15:47.653430 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.653363 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-kmxlt"] Apr 22 14:15:47.653430 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.653429 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.655296 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.655278 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 14:15:47.655406 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.655310 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpmwg\"" Apr 22 14:15:47.655508 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.655494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 14:15:47.679828 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.679804 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kmxlt"] Apr 22 14:15:47.679828 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.679830 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b7rqq"] Apr 22 14:15:47.679988 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.679931 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:15:47.682706 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.682667 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:15:47.683108 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.683083 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:15:47.683212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.683195 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:15:47.683340 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.683322 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lthk8\"" Apr 22 14:15:47.773706 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.773666 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:15:47.773855 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.773666 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:15:47.773915 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.773666 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh" Apr 22 14:15:47.776218 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.776194 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:15:47.776353 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.776285 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:15:47.776353 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.776320 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:15:47.776353 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.776348 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jjvkf\"" Apr 22 14:15:47.776353 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.776286 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:15:47.776563 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.776509 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mfkfw\"" Apr 22 14:15:47.792077 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-installation-pull-secrets\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.792205 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792082 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dvt8r\" (UniqueName: \"kubernetes.io/projected/6425daea-30dc-424d-a7ff-d63860240eee-kube-api-access-dvt8r\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:15:47.792205 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792114 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.792205 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-certificates\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.792205 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792173 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-bound-sa-token\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.792421 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjmqs\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-kube-api-access-fjmqs\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: 
\"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.792421 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.792421 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e58ca55-5638-4845-9cb8-959ad4a0d61f-ca-trust-extracted\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.792421 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-trusted-ca\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.792608 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cab656ec-aafc-44d5-bcf4-998f7334f612-tmp-dir\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.792608 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792469 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:15:47.792608 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smr62\" (UniqueName: \"kubernetes.io/projected/cab656ec-aafc-44d5-bcf4-998f7334f612-kube-api-access-smr62\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.792608 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-image-registry-private-configuration\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.792608 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.792600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cab656ec-aafc-44d5-bcf4-998f7334f612-config-volume\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.893220 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:15:47.893220 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893203 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smr62\" (UniqueName: \"kubernetes.io/projected/cab656ec-aafc-44d5-bcf4-998f7334f612-kube-api-access-smr62\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:47.893239 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-image-registry-private-configuration\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cab656ec-aafc-44d5-bcf4-998f7334f612-config-volume\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:47.893317 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert podName:6425daea-30dc-424d-a7ff-d63860240eee nodeName:}" failed. No retries permitted until 2026-04-22 14:15:48.393296117 +0000 UTC m=+33.191058731 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert") pod "ingress-canary-kmxlt" (UID: "6425daea-30dc-424d-a7ff-d63860240eee") : secret "canary-serving-cert" not found Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-installation-pull-secrets\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893410 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvt8r\" (UniqueName: \"kubernetes.io/projected/6425daea-30dc-424d-a7ff-d63860240eee-kube-api-access-dvt8r\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893462 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-certificates\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.893736 
ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-bound-sa-token\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjmqs\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-kube-api-access-fjmqs\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e58ca55-5638-4845-9cb8-959ad4a0d61f-ca-trust-extracted\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893611 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-trusted-ca\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: 
\"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cab656ec-aafc-44d5-bcf4-998f7334f612-tmp-dir\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:47.893633 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:47.893651 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v: secret "image-registry-tls" not found Apr 22 14:15:47.893736 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:47.893718 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls podName:5e58ca55-5638-4845-9cb8-959ad4a0d61f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:48.393682064 +0000 UTC m=+33.191444669 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls") pod "image-registry-5dbc4fdcb4-5wb6v" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f") : secret "image-registry-tls" not found Apr 22 14:15:47.894529 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.893931 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cab656ec-aafc-44d5-bcf4-998f7334f612-tmp-dir\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.894529 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:47.894017 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:47.894529 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:47.894056 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls podName:cab656ec-aafc-44d5-bcf4-998f7334f612 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:48.394043411 +0000 UTC m=+33.191806015 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls") pod "dns-default-b7rqq" (UID: "cab656ec-aafc-44d5-bcf4-998f7334f612") : secret "dns-default-metrics-tls" not found Apr 22 14:15:47.894529 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.894343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e58ca55-5638-4845-9cb8-959ad4a0d61f-ca-trust-extracted\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.894529 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.894368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cab656ec-aafc-44d5-bcf4-998f7334f612-config-volume\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:15:47.894750 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.894537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-certificates\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.895077 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.894948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-trusted-ca\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:15:47.898155 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.898119 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-image-registry-private-configuration\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:15:47.898285 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.898163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-installation-pull-secrets\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:15:47.906062 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.905975 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smr62\" (UniqueName: \"kubernetes.io/projected/cab656ec-aafc-44d5-bcf4-998f7334f612-kube-api-access-smr62\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq"
Apr 22 14:15:47.906062 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.905971 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-bound-sa-token\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:15:47.906229 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.906105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjmqs\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-kube-api-access-fjmqs\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:15:47.906291 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:47.906270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvt8r\" (UniqueName: \"kubernetes.io/projected/6425daea-30dc-424d-a7ff-d63860240eee-kube-api-access-dvt8r\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt"
Apr 22 14:15:48.398288 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:48.398255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:15:48.398467 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:48.398313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq"
Apr 22 14:15:48.398467 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:48.398348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt"
Apr 22 14:15:48.398467 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:48.398418 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:48.398467 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:48.398418 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:48.398467 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:48.398421 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 14:15:48.398711 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:48.398477 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v: secret "image-registry-tls" not found
Apr 22 14:15:48.398711 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:48.398488 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert podName:6425daea-30dc-424d-a7ff-d63860240eee nodeName:}" failed. No retries permitted until 2026-04-22 14:15:49.398467392 +0000 UTC m=+34.196229995 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert") pod "ingress-canary-kmxlt" (UID: "6425daea-30dc-424d-a7ff-d63860240eee") : secret "canary-serving-cert" not found
Apr 22 14:15:48.398711 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:48.398506 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls podName:cab656ec-aafc-44d5-bcf4-998f7334f612 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:49.398497738 +0000 UTC m=+34.196260341 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls") pod "dns-default-b7rqq" (UID: "cab656ec-aafc-44d5-bcf4-998f7334f612") : secret "dns-default-metrics-tls" not found
Apr 22 14:15:48.398711 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:48.398520 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls podName:5e58ca55-5638-4845-9cb8-959ad4a0d61f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:49.398513001 +0000 UTC m=+34.196275601 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls") pod "image-registry-5dbc4fdcb4-5wb6v" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f") : secret "image-registry-tls" not found
Apr 22 14:15:49.407038 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:49.406835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:49.407062 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:49.406997 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:49.407132 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v: secret "image-registry-tls" not found
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:49.407101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq"
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:49.407179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt"
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:49.407203 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls podName:5e58ca55-5638-4845-9cb8-959ad4a0d61f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:51.407182815 +0000 UTC m=+36.204945431 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls") pod "image-registry-5dbc4fdcb4-5wb6v" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f") : secret "image-registry-tls" not found
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:49.407210 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:49.407234 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:49.407262 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs podName:6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d nodeName:}" failed. No retries permitted until 2026-04-22 14:16:21.407245179 +0000 UTC m=+66.205007784 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs") pod "network-metrics-daemon-qh8tk" (UID: "6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d") : secret "metrics-daemon-secret" not found
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:49.407290 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert podName:6425daea-30dc-424d-a7ff-d63860240eee nodeName:}" failed. No retries permitted until 2026-04-22 14:15:51.407280788 +0000 UTC m=+36.205043395 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert") pod "ingress-canary-kmxlt" (UID: "6425daea-30dc-424d-a7ff-d63860240eee") : secret "canary-serving-cert" not found
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:49.407325 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:49.407469 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:49.407370 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls podName:cab656ec-aafc-44d5-bcf4-998f7334f612 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:51.407355725 +0000 UTC m=+36.205118339 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls") pod "dns-default-b7rqq" (UID: "cab656ec-aafc-44d5-bcf4-998f7334f612") : secret "dns-default-metrics-tls" not found
Apr 22 14:15:49.608971 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:49.608930 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csqx8\" (UniqueName: \"kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8\") pod \"network-check-target-mj694\" (UID: \"083554f0-d10f-417b-ac2a-68e07b68b98b\") " pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:49.612087 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:49.612056 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csqx8\" (UniqueName: \"kubernetes.io/projected/083554f0-d10f-417b-ac2a-68e07b68b98b-kube-api-access-csqx8\") pod \"network-check-target-mj694\" (UID: \"083554f0-d10f-417b-ac2a-68e07b68b98b\") " pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:49.892788 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:49.892756 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:50.552540 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:50.552511 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mj694"]
Apr 22 14:15:50.666225 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:50.666150 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod083554f0_d10f_417b_ac2a_68e07b68b98b.slice/crio-1aef1d8c5bd7db02d316a96ef28894bbaeebefd4a3e3b8302d00e370535970b8 WatchSource:0}: Error finding container 1aef1d8c5bd7db02d316a96ef28894bbaeebefd4a3e3b8302d00e370535970b8: Status 404 returned error can't find the container with id 1aef1d8c5bd7db02d316a96ef28894bbaeebefd4a3e3b8302d00e370535970b8
Apr 22 14:15:50.946540 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:50.946506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" event={"ID":"7da714b4-f642-4144-b84b-7f4d1f4cbe60","Type":"ContainerStarted","Data":"8a37b6ae8aa82bf673809612dd1438f041f5061500dcb24eeb383ab059be27fa"}
Apr 22 14:15:50.947590 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:50.947563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mj694" event={"ID":"083554f0-d10f-417b-ac2a-68e07b68b98b","Type":"ContainerStarted","Data":"1aef1d8c5bd7db02d316a96ef28894bbaeebefd4a3e3b8302d00e370535970b8"}
Apr 22 14:15:51.424963 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:51.424921 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:15:51.425141 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:51.424986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq"
Apr 22 14:15:51.425141 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:51.425016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt"
Apr 22 14:15:51.425141 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:51.425096 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 14:15:51.425141 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:51.425120 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:51.425141 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:51.425130 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:51.425339 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:51.425121 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v: secret "image-registry-tls" not found
Apr 22 14:15:51.425339 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:51.425181 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert podName:6425daea-30dc-424d-a7ff-d63860240eee nodeName:}" failed. No retries permitted until 2026-04-22 14:15:55.425164555 +0000 UTC m=+40.222927157 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert") pod "ingress-canary-kmxlt" (UID: "6425daea-30dc-424d-a7ff-d63860240eee") : secret "canary-serving-cert" not found
Apr 22 14:15:51.425339 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:51.425199 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls podName:5e58ca55-5638-4845-9cb8-959ad4a0d61f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:55.425187676 +0000 UTC m=+40.222950289 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls") pod "image-registry-5dbc4fdcb4-5wb6v" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f") : secret "image-registry-tls" not found
Apr 22 14:15:51.425339 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:51.425214 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls podName:cab656ec-aafc-44d5-bcf4-998f7334f612 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:55.425206568 +0000 UTC m=+40.222969181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls") pod "dns-default-b7rqq" (UID: "cab656ec-aafc-44d5-bcf4-998f7334f612") : secret "dns-default-metrics-tls" not found
Apr 22 14:15:51.952247 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:51.952044 2573 generic.go:358] "Generic (PLEG): container finished" podID="7da714b4-f642-4144-b84b-7f4d1f4cbe60" containerID="8a37b6ae8aa82bf673809612dd1438f041f5061500dcb24eeb383ab059be27fa" exitCode=0
Apr 22 14:15:51.952611 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:51.952115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" event={"ID":"7da714b4-f642-4144-b84b-7f4d1f4cbe60","Type":"ContainerDied","Data":"8a37b6ae8aa82bf673809612dd1438f041f5061500dcb24eeb383ab059be27fa"}
Apr 22 14:15:52.957351 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:52.957318 2573 generic.go:358] "Generic (PLEG): container finished" podID="7da714b4-f642-4144-b84b-7f4d1f4cbe60" containerID="a52d18578a6e7e0117752b7b45b54127177a6fb78f9288c46c1fce7147db08f0" exitCode=0
Apr 22 14:15:52.957978 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:52.957369 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" event={"ID":"7da714b4-f642-4144-b84b-7f4d1f4cbe60","Type":"ContainerDied","Data":"a52d18578a6e7e0117752b7b45b54127177a6fb78f9288c46c1fce7147db08f0"}
Apr 22 14:15:53.960561 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:53.960527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mj694" event={"ID":"083554f0-d10f-417b-ac2a-68e07b68b98b","Type":"ContainerStarted","Data":"18c05d0f59409b7c7899be73bdb02d2af4fb03c1e7783aff83228ca1f6e8439d"}
Apr 22 14:15:53.960979 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:53.960600 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mj694"
Apr 22 14:15:53.963259 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:53.963236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" event={"ID":"7da714b4-f642-4144-b84b-7f4d1f4cbe60","Type":"ContainerStarted","Data":"0196ca71bbf08a9ce20a1b4c293c50c5c83bc485df5d6166b55af7a1a48ba12b"}
Apr 22 14:15:53.977243 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:53.977201 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mj694" podStartSLOduration=34.851447781 podStartE2EDuration="37.977188303s" podCreationTimestamp="2026-04-22 14:15:16 +0000 UTC" firstStartedPulling="2026-04-22 14:15:50.673655298 +0000 UTC m=+35.471417902" lastFinishedPulling="2026-04-22 14:15:53.799395808 +0000 UTC m=+38.597158424" observedRunningTime="2026-04-22 14:15:53.976591385 +0000 UTC m=+38.774354005" watchObservedRunningTime="2026-04-22 14:15:53.977188303 +0000 UTC m=+38.774950923"
Apr 22 14:15:53.998830 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:53.998778 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zbj7k" podStartSLOduration=5.657285838 podStartE2EDuration="37.998741299s" podCreationTimestamp="2026-04-22 14:15:16 +0000 UTC" firstStartedPulling="2026-04-22 14:15:18.353938367 +0000 UTC m=+3.151700966" lastFinishedPulling="2026-04-22 14:15:50.695393812 +0000 UTC m=+35.493156427" observedRunningTime="2026-04-22 14:15:53.997548822 +0000 UTC m=+38.795311443" watchObservedRunningTime="2026-04-22 14:15:53.998741299 +0000 UTC m=+38.796503923"
Apr 22 14:15:55.456928 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:55.456886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt"
Apr 22 14:15:55.457354 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:55.456977 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:15:55.457354 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:55.457026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq"
Apr 22 14:15:55.457354 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:55.457043 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:55.457354 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:55.457107 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert podName:6425daea-30dc-424d-a7ff-d63860240eee nodeName:}" failed. No retries permitted until 2026-04-22 14:16:03.457089092 +0000 UTC m=+48.254851715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert") pod "ingress-canary-kmxlt" (UID: "6425daea-30dc-424d-a7ff-d63860240eee") : secret "canary-serving-cert" not found
Apr 22 14:15:55.457354 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:55.457108 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:55.457354 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:55.457113 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 14:15:55.457354 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:55.457129 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v: secret "image-registry-tls" not found
Apr 22 14:15:55.457354 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:55.457144 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls podName:cab656ec-aafc-44d5-bcf4-998f7334f612 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:03.457133574 +0000 UTC m=+48.254896174 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls") pod "dns-default-b7rqq" (UID: "cab656ec-aafc-44d5-bcf4-998f7334f612") : secret "dns-default-metrics-tls" not found
Apr 22 14:15:55.457354 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:15:55.457184 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls podName:5e58ca55-5638-4845-9cb8-959ad4a0d61f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:03.457170266 +0000 UTC m=+48.254932865 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls") pod "image-registry-5dbc4fdcb4-5wb6v" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f") : secret "image-registry-tls" not found
Apr 22 14:15:57.977124 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:57.977089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:57.980512 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:57.980476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4bc0a54-af51-4f91-af28-fff86847e8d6-original-pull-secret\") pod \"global-pull-secret-syncer-2wkqh\" (UID: \"d4bc0a54-af51-4f91-af28-fff86847e8d6\") " pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:57.998640 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:57.998609 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wkqh"
Apr 22 14:15:58.112528 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:58.112492 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2wkqh"]
Apr 22 14:15:58.118526 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:15:58.118497 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4bc0a54_af51_4f91_af28_fff86847e8d6.slice/crio-b04cbd06e50e81c3d36f13f23928449eb0adc8ec78e038c650d697e5701c1f8a WatchSource:0}: Error finding container b04cbd06e50e81c3d36f13f23928449eb0adc8ec78e038c650d697e5701c1f8a: Status 404 returned error can't find the container with id b04cbd06e50e81c3d36f13f23928449eb0adc8ec78e038c650d697e5701c1f8a
Apr 22 14:15:58.973461 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:15:58.973421 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2wkqh" event={"ID":"d4bc0a54-af51-4f91-af28-fff86847e8d6","Type":"ContainerStarted","Data":"b04cbd06e50e81c3d36f13f23928449eb0adc8ec78e038c650d697e5701c1f8a"}
Apr 22 14:16:02.982241 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:02.982204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2wkqh" event={"ID":"d4bc0a54-af51-4f91-af28-fff86847e8d6","Type":"ContainerStarted","Data":"f0f25846ba0a1052bd86aad9d13ab1dcd478e7785fbddf9fe46740d0b90bd278"}
Apr 22 14:16:02.998227 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:02.998173 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2wkqh" podStartSLOduration=33.133069085 podStartE2EDuration="36.998158297s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="2026-04-22 14:15:58.120530772 +0000 UTC m=+42.918293372" lastFinishedPulling="2026-04-22 14:16:01.985619979 +0000 UTC m=+46.783382584" observedRunningTime="2026-04-22 14:16:02.998143474 +0000 UTC m=+47.795906096" watchObservedRunningTime="2026-04-22 14:16:02.998158297 +0000 UTC m=+47.795920916"
Apr 22 14:16:03.513619 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:03.513579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:16:03.513884 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:03.513651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq"
Apr 22 14:16:03.513884 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:03.513680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt"
Apr 22 14:16:03.513884 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:03.513708 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 14:16:03.513884 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:03.513729 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v: secret "image-registry-tls" not found
Apr 22 14:16:03.513884 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:03.513781 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls podName:5e58ca55-5638-4845-9cb8-959ad4a0d61f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:19.513764152 +0000 UTC m=+64.311526759 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls") pod "image-registry-5dbc4fdcb4-5wb6v" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f") : secret "image-registry-tls" not found
Apr 22 14:16:03.513884 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:03.513803 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:03.513884 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:03.513814 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:03.513884 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:03.513860 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert podName:6425daea-30dc-424d-a7ff-d63860240eee nodeName:}" failed. No retries permitted until 2026-04-22 14:16:19.513847738 +0000 UTC m=+64.311610356 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert") pod "ingress-canary-kmxlt" (UID: "6425daea-30dc-424d-a7ff-d63860240eee") : secret "canary-serving-cert" not found
Apr 22 14:16:03.513884 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:03.513872 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls podName:cab656ec-aafc-44d5-bcf4-998f7334f612 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:19.513867031 +0000 UTC m=+64.311629630 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls") pod "dns-default-b7rqq" (UID: "cab656ec-aafc-44d5-bcf4-998f7334f612") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:13.937368 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:13.937340 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t88vk"
Apr 22 14:16:19.519398 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:19.519344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq"
Apr 22 14:16:19.519398 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:19.519405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt"
Apr 22 14:16:19.519842 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:19.519468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:16:19.519842 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:19.519503 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:19.519842 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:19.519556 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 14:16:19.519842 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:19.519566 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v: secret "image-registry-tls" not found
Apr 22 14:16:19.519842 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:19.519572 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert podName:6425daea-30dc-424d-a7ff-d63860240eee nodeName:}" failed. No retries permitted until 2026-04-22 14:16:51.519556912 +0000 UTC m=+96.317319514 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert") pod "ingress-canary-kmxlt" (UID: "6425daea-30dc-424d-a7ff-d63860240eee") : secret "canary-serving-cert" not found
Apr 22 14:16:19.519842 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:19.519507 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:19.519842 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:19.519615 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls podName:5e58ca55-5638-4845-9cb8-959ad4a0d61f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:51.519600983 +0000 UTC m=+96.317363583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls") pod "image-registry-5dbc4fdcb4-5wb6v" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f") : secret "image-registry-tls" not found
Apr 22 14:16:19.519842 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:19.519638 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls podName:cab656ec-aafc-44d5-bcf4-998f7334f612 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:51.519624529 +0000 UTC m=+96.317387133 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls") pod "dns-default-b7rqq" (UID: "cab656ec-aafc-44d5-bcf4-998f7334f612") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:21.431381 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:21.431339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:16:21.431787 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:21.431482 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 14:16:21.431787 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:21.431541 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs podName:6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d nodeName:}" failed. No retries permitted until 2026-04-22 14:17:25.43152528 +0000 UTC m=+130.229287896 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs") pod "network-metrics-daemon-qh8tk" (UID: "6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d") : secret "metrics-daemon-secret" not found Apr 22 14:16:24.966829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:24.966797 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mj694" Apr 22 14:16:51.548261 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:51.548206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:16:51.548633 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:51.548275 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:16:51.548633 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:16:51.548297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:16:51.548633 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:51.548365 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:51.548633 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:51.548394 2573 projected.go:194] Error 
preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v: secret "image-registry-tls" not found Apr 22 14:16:51.548633 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:51.548403 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:51.548633 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:51.548369 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:51.548633 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:51.548473 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls podName:5e58ca55-5638-4845-9cb8-959ad4a0d61f nodeName:}" failed. No retries permitted until 2026-04-22 14:17:55.548453037 +0000 UTC m=+160.346215649 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls") pod "image-registry-5dbc4fdcb4-5wb6v" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f") : secret "image-registry-tls" not found Apr 22 14:16:51.548633 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:51.548487 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls podName:cab656ec-aafc-44d5-bcf4-998f7334f612 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:55.548481442 +0000 UTC m=+160.346244041 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls") pod "dns-default-b7rqq" (UID: "cab656ec-aafc-44d5-bcf4-998f7334f612") : secret "dns-default-metrics-tls" not found Apr 22 14:16:51.548633 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:16:51.548497 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert podName:6425daea-30dc-424d-a7ff-d63860240eee nodeName:}" failed. No retries permitted until 2026-04-22 14:17:55.548491613 +0000 UTC m=+160.346254212 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert") pod "ingress-canary-kmxlt" (UID: "6425daea-30dc-424d-a7ff-d63860240eee") : secret "canary-serving-cert" not found Apr 22 14:17:25.484698 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:25.484653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:17:25.485189 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:25.484802 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:17:25.485189 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:25.484868 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs podName:6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d nodeName:}" failed. No retries permitted until 2026-04-22 14:19:27.484852592 +0000 UTC m=+252.282615195 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs") pod "network-metrics-daemon-qh8tk" (UID: "6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d") : secret "metrics-daemon-secret" not found Apr 22 14:17:37.905097 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:37.905066 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2zk62"] Apr 22 14:17:37.907598 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:37.907583 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:37.909583 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:37.909554 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 14:17:37.909583 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:37.909576 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 14:17:37.909782 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:37.909602 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 14:17:37.910413 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:37.910397 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 14:17:37.910530 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:37.910514 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-cvjd2\"" Apr 22 14:17:37.915375 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:37.915356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 14:17:37.915466 ip-10-0-136-45 
kubenswrapper[2573]: I0422 14:17:37.915388 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2zk62"] Apr 22 14:17:38.004271 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.004242 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj"] Apr 22 14:17:38.007069 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.007040 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.011167 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.011042 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 14:17:38.011167 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.011066 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 14:17:38.011167 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.011066 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 14:17:38.011378 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.011191 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 14:17:38.011378 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.011323 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-psw2g\"" Apr 22 14:17:38.018000 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.017977 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj"] Apr 22 14:17:38.072854 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.072835 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-snapshots\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.072935 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.072860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-service-ca-bundle\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.072935 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.072884 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-tmp\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.072935 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.072900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-serving-cert\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.073052 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.072959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4t6c\" (UniqueName: \"kubernetes.io/projected/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-kube-api-access-b4t6c\") pod 
\"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.073052 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.073011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.107261 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.107239 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g"] Apr 22 14:17:38.109788 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.109774 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm"] Apr 22 14:17:38.109916 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.109901 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g" Apr 22 14:17:38.111743 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.111726 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-gsj2s\"" Apr 22 14:17:38.112365 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.112347 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm" Apr 22 14:17:38.116630 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.116614 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:38.117042 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.117023 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:38.117273 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.117259 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-tclw5\"" Apr 22 14:17:38.117493 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.117477 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj"] Apr 22 14:17:38.120125 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.120110 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:38.121989 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.121969 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 14:17:38.122490 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.122472 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:38.122596 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.122581 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-8hcmp\"" Apr 22 14:17:38.122702 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.122670 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:38.126406 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.126377 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm"] Apr 22 14:17:38.130397 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.130381 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g"] Apr 22 14:17:38.138773 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.138726 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj"] Apr 22 14:17:38.173625 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.173565 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.173775 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.173757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-snapshots\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.173871 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.173790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-service-ca-bundle\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.173871 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.173830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-tmp\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.173871 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.173866 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-serving-cert\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.174014 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.173965 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/84e5fe39-c177-4874-a08b-cec368549879-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.174014 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.173995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4t6c\" (UniqueName: \"kubernetes.io/projected/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-kube-api-access-b4t6c\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.174115 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.174023 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.174243 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.174073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbft7\" (UniqueName: \"kubernetes.io/projected/84e5fe39-c177-4874-a08b-cec368549879-kube-api-access-nbft7\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.174323 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.174307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-service-ca-bundle\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.174565 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.174545 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.174841 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.174824 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-tmp\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.174909 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.174839 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-snapshots\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.176467 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.176449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-serving-cert\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.181980 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.181958 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b4t6c\" (UniqueName: \"kubernetes.io/projected/9de5847f-e323-4cd6-9aef-fde65fdaa5e2-kube-api-access-b4t6c\") pod \"insights-operator-585dfdc468-2zk62\" (UID: \"9de5847f-e323-4cd6-9aef-fde65fdaa5e2\") " pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.216138 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.216118 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-2zk62" Apr 22 14:17:38.216980 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.216957 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79"] Apr 22 14:17:38.220807 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.220790 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.223496 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.223478 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-fchp7\"" Apr 22 14:17:38.223872 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.223856 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 14:17:38.224041 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.223903 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:38.224172 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.223906 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:38.224328 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.224310 2573 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 14:17:38.233993 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.233975 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79"] Apr 22 14:17:38.275175 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.275140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/84e5fe39-c177-4874-a08b-cec368549879-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.275275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.275196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.275275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.275234 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:38.275275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.275270 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcx9g\" (UniqueName: 
\"kubernetes.io/projected/dd765af3-73fe-4f78-9672-7301cb5a0352-kube-api-access-gcx9g\") pod \"network-check-source-8894fc9bd-8tn6g\" (UID: \"dd765af3-73fe-4f78-9672-7301cb5a0352\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g" Apr 22 14:17:38.275368 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.275339 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfpt8\" (UniqueName: \"kubernetes.io/projected/3b3c692c-1e00-4b52-b92e-ee544698dbd8-kube-api-access-wfpt8\") pod \"volume-data-source-validator-7c6cbb6c87-pp5sm\" (UID: \"3b3c692c-1e00-4b52-b92e-ee544698dbd8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm" Apr 22 14:17:38.275400 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.275370 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcntz\" (UniqueName: \"kubernetes.io/projected/070b7b4b-554f-4feb-b7a5-5e8185d64f96-kube-api-access-bcntz\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:38.275444 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.275409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbft7\" (UniqueName: \"kubernetes.io/projected/84e5fe39-c177-4874-a08b-cec368549879-kube-api-access-nbft7\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.275647 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:38.275620 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:38.275764 ip-10-0-136-45 kubenswrapper[2573]: 
E0422 14:17:38.275712 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls podName:84e5fe39-c177-4874-a08b-cec368549879 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:38.775672374 +0000 UTC m=+143.573434981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fv6qj" (UID: "84e5fe39-c177-4874-a08b-cec368549879") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:38.275912 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.275892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/84e5fe39-c177-4874-a08b-cec368549879-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.284160 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.284140 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbft7\" (UniqueName: \"kubernetes.io/projected/84e5fe39-c177-4874-a08b-cec368549879-kube-api-access-nbft7\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.335764 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.335737 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2zk62"] Apr 22 14:17:38.339381 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:17:38.339350 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de5847f_e323_4cd6_9aef_fde65fdaa5e2.slice/crio-0d679d62bc5bcdaa4ff810d3694cefa420888b7692043deb0c34103de21b2e0b WatchSource:0}: Error finding container 0d679d62bc5bcdaa4ff810d3694cefa420888b7692043deb0c34103de21b2e0b: Status 404 returned error can't find the container with id 0d679d62bc5bcdaa4ff810d3694cefa420888b7692043deb0c34103de21b2e0b Apr 22 14:17:38.376458 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.376439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfpt8\" (UniqueName: \"kubernetes.io/projected/3b3c692c-1e00-4b52-b92e-ee544698dbd8-kube-api-access-wfpt8\") pod \"volume-data-source-validator-7c6cbb6c87-pp5sm\" (UID: \"3b3c692c-1e00-4b52-b92e-ee544698dbd8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm" Apr 22 14:17:38.376568 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.376466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcntz\" (UniqueName: \"kubernetes.io/projected/070b7b4b-554f-4feb-b7a5-5e8185d64f96-kube-api-access-bcntz\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:38.376568 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.376487 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfb9086c-f831-4140-bf45-0520130af0ae-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9pz79\" (UID: \"dfb9086c-f831-4140-bf45-0520130af0ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.376568 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.376566 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:38.376729 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.376597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-794xv\" (UniqueName: \"kubernetes.io/projected/dfb9086c-f831-4140-bf45-0520130af0ae-kube-api-access-794xv\") pod \"service-ca-operator-d6fc45fc5-9pz79\" (UID: \"dfb9086c-f831-4140-bf45-0520130af0ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.376729 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.376638 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcx9g\" (UniqueName: \"kubernetes.io/projected/dd765af3-73fe-4f78-9672-7301cb5a0352-kube-api-access-gcx9g\") pod \"network-check-source-8894fc9bd-8tn6g\" (UID: \"dd765af3-73fe-4f78-9672-7301cb5a0352\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g" Apr 22 14:17:38.376729 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:38.376702 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:38.376841 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.376740 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfb9086c-f831-4140-bf45-0520130af0ae-config\") pod \"service-ca-operator-d6fc45fc5-9pz79\" (UID: \"dfb9086c-f831-4140-bf45-0520130af0ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.376841 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:38.376761 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls podName:070b7b4b-554f-4feb-b7a5-5e8185d64f96 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:38.876742073 +0000 UTC m=+143.674504686 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fjgkj" (UID: "070b7b4b-554f-4feb-b7a5-5e8185d64f96") : secret "samples-operator-tls" not found Apr 22 14:17:38.385417 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.385389 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfpt8\" (UniqueName: \"kubernetes.io/projected/3b3c692c-1e00-4b52-b92e-ee544698dbd8-kube-api-access-wfpt8\") pod \"volume-data-source-validator-7c6cbb6c87-pp5sm\" (UID: \"3b3c692c-1e00-4b52-b92e-ee544698dbd8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm" Apr 22 14:17:38.385499 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.385426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcx9g\" (UniqueName: \"kubernetes.io/projected/dd765af3-73fe-4f78-9672-7301cb5a0352-kube-api-access-gcx9g\") pod \"network-check-source-8894fc9bd-8tn6g\" (UID: \"dd765af3-73fe-4f78-9672-7301cb5a0352\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g" Apr 22 14:17:38.385547 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.385532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcntz\" (UniqueName: \"kubernetes.io/projected/070b7b4b-554f-4feb-b7a5-5e8185d64f96-kube-api-access-bcntz\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" 
Apr 22 14:17:38.419354 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.419335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g" Apr 22 14:17:38.424859 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.424814 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm" Apr 22 14:17:38.478562 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.478017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-794xv\" (UniqueName: \"kubernetes.io/projected/dfb9086c-f831-4140-bf45-0520130af0ae-kube-api-access-794xv\") pod \"service-ca-operator-d6fc45fc5-9pz79\" (UID: \"dfb9086c-f831-4140-bf45-0520130af0ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.478562 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.478061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfb9086c-f831-4140-bf45-0520130af0ae-config\") pod \"service-ca-operator-d6fc45fc5-9pz79\" (UID: \"dfb9086c-f831-4140-bf45-0520130af0ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.478562 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.478116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfb9086c-f831-4140-bf45-0520130af0ae-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9pz79\" (UID: \"dfb9086c-f831-4140-bf45-0520130af0ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.479315 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.479103 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dfb9086c-f831-4140-bf45-0520130af0ae-config\") pod \"service-ca-operator-d6fc45fc5-9pz79\" (UID: \"dfb9086c-f831-4140-bf45-0520130af0ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.481424 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.481357 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfb9086c-f831-4140-bf45-0520130af0ae-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9pz79\" (UID: \"dfb9086c-f831-4140-bf45-0520130af0ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.488293 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.488269 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-794xv\" (UniqueName: \"kubernetes.io/projected/dfb9086c-f831-4140-bf45-0520130af0ae-kube-api-access-794xv\") pod \"service-ca-operator-d6fc45fc5-9pz79\" (UID: \"dfb9086c-f831-4140-bf45-0520130af0ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.531858 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.531830 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g"] Apr 22 14:17:38.534340 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:17:38.534317 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd765af3_73fe_4f78_9672_7301cb5a0352.slice/crio-9a4f79dab3c2639253dc9bd16d47c6af61e387e325f05ed5ca2c96db506e52c8 WatchSource:0}: Error finding container 9a4f79dab3c2639253dc9bd16d47c6af61e387e325f05ed5ca2c96db506e52c8: Status 404 returned error can't find the container with id 9a4f79dab3c2639253dc9bd16d47c6af61e387e325f05ed5ca2c96db506e52c8 Apr 22 14:17:38.539148 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.539131 2573 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" Apr 22 14:17:38.546035 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.546016 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm"] Apr 22 14:17:38.548706 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:17:38.548664 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b3c692c_1e00_4b52_b92e_ee544698dbd8.slice/crio-4be9067d659213e41ef7e68707eda02caba0f8b4014f10b7a94d620b11695255 WatchSource:0}: Error finding container 4be9067d659213e41ef7e68707eda02caba0f8b4014f10b7a94d620b11695255: Status 404 returned error can't find the container with id 4be9067d659213e41ef7e68707eda02caba0f8b4014f10b7a94d620b11695255 Apr 22 14:17:38.653394 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.653364 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79"] Apr 22 14:17:38.656231 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:17:38.656195 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfb9086c_f831_4140_bf45_0520130af0ae.slice/crio-17f161e8af8985cd0cb9db3d5789397e262607b7585cb9409f4a50763f0297be WatchSource:0}: Error finding container 17f161e8af8985cd0cb9db3d5789397e262607b7585cb9409f4a50763f0297be: Status 404 returned error can't find the container with id 17f161e8af8985cd0cb9db3d5789397e262607b7585cb9409f4a50763f0297be Apr 22 14:17:38.781409 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.781375 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:38.781590 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:38.781564 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:38.781730 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:38.781657 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls podName:84e5fe39-c177-4874-a08b-cec368549879 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:39.781634382 +0000 UTC m=+144.579396990 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fv6qj" (UID: "84e5fe39-c177-4874-a08b-cec368549879") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:38.881785 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:38.881751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:38.881950 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:38.881929 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:38.882029 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:38.882018 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls podName:070b7b4b-554f-4feb-b7a5-5e8185d64f96 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:39.881996989 +0000 UTC m=+144.679759604 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fjgkj" (UID: "070b7b4b-554f-4feb-b7a5-5e8185d64f96") : secret "samples-operator-tls" not found Apr 22 14:17:39.162418 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:39.161949 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2zk62" event={"ID":"9de5847f-e323-4cd6-9aef-fde65fdaa5e2","Type":"ContainerStarted","Data":"0d679d62bc5bcdaa4ff810d3694cefa420888b7692043deb0c34103de21b2e0b"} Apr 22 14:17:39.163999 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:39.163966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" event={"ID":"dfb9086c-f831-4140-bf45-0520130af0ae","Type":"ContainerStarted","Data":"17f161e8af8985cd0cb9db3d5789397e262607b7585cb9409f4a50763f0297be"} Apr 22 14:17:39.165330 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:39.165260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm" event={"ID":"3b3c692c-1e00-4b52-b92e-ee544698dbd8","Type":"ContainerStarted","Data":"4be9067d659213e41ef7e68707eda02caba0f8b4014f10b7a94d620b11695255"} Apr 22 14:17:39.167321 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:39.167295 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g" 
event={"ID":"dd765af3-73fe-4f78-9672-7301cb5a0352","Type":"ContainerStarted","Data":"ba1dba541be060a511256d6c7dbada3a82e8976186ffe3899eab2b8e1fe199a3"} Apr 22 14:17:39.167431 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:39.167330 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g" event={"ID":"dd765af3-73fe-4f78-9672-7301cb5a0352","Type":"ContainerStarted","Data":"9a4f79dab3c2639253dc9bd16d47c6af61e387e325f05ed5ca2c96db506e52c8"} Apr 22 14:17:39.184865 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:39.184682 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8tn6g" podStartSLOduration=1.184664886 podStartE2EDuration="1.184664886s" podCreationTimestamp="2026-04-22 14:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:39.183157617 +0000 UTC m=+143.980920239" watchObservedRunningTime="2026-04-22 14:17:39.184664886 +0000 UTC m=+143.982427508" Apr 22 14:17:39.789180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:39.789140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:39.789358 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:39.789321 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:39.789474 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:39.789392 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls podName:84e5fe39-c177-4874-a08b-cec368549879 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.789372502 +0000 UTC m=+146.587135112 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fv6qj" (UID: "84e5fe39-c177-4874-a08b-cec368549879") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:39.889875 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:39.889835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:39.890044 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:39.889986 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:39.890101 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:39.890060 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls podName:070b7b4b-554f-4feb-b7a5-5e8185d64f96 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.890041644 +0000 UTC m=+146.687804250 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fjgkj" (UID: "070b7b4b-554f-4feb-b7a5-5e8185d64f96") : secret "samples-operator-tls" not found Apr 22 14:17:41.173863 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:41.173814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2zk62" event={"ID":"9de5847f-e323-4cd6-9aef-fde65fdaa5e2","Type":"ContainerStarted","Data":"886dba13d9250732e158f3646ed45eefa11f09d9982f2192151f8eabca4f41ad"} Apr 22 14:17:41.175337 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:41.175312 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" event={"ID":"dfb9086c-f831-4140-bf45-0520130af0ae","Type":"ContainerStarted","Data":"b3062bb7ee49f09c98ffbc003a9976e75079d923d0ac3153db9711e178d25245"} Apr 22 14:17:41.176518 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:41.176499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm" event={"ID":"3b3c692c-1e00-4b52-b92e-ee544698dbd8","Type":"ContainerStarted","Data":"bd1b6d3741be0cf55a92c952a575a21e60c7cd09ea1dad35ac9889fe09c677e4"} Apr 22 14:17:41.192374 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:41.192334 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-2zk62" podStartSLOduration=1.5561877050000001 podStartE2EDuration="4.192323828s" podCreationTimestamp="2026-04-22 14:17:37 +0000 UTC" firstStartedPulling="2026-04-22 14:17:38.340625801 +0000 UTC m=+143.138388400" lastFinishedPulling="2026-04-22 14:17:40.976761914 +0000 UTC m=+145.774524523" observedRunningTime="2026-04-22 14:17:41.192172249 +0000 UTC m=+145.989934869" watchObservedRunningTime="2026-04-22 
14:17:41.192323828 +0000 UTC m=+145.990086450" Apr 22 14:17:41.211854 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:41.211783 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pp5sm" podStartSLOduration=0.785077107 podStartE2EDuration="3.2117721s" podCreationTimestamp="2026-04-22 14:17:38 +0000 UTC" firstStartedPulling="2026-04-22 14:17:38.550442102 +0000 UTC m=+143.348204720" lastFinishedPulling="2026-04-22 14:17:40.977137106 +0000 UTC m=+145.774899713" observedRunningTime="2026-04-22 14:17:41.21100785 +0000 UTC m=+146.008770470" watchObservedRunningTime="2026-04-22 14:17:41.2117721 +0000 UTC m=+146.009534765" Apr 22 14:17:41.227677 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:41.227639 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" podStartSLOduration=0.903801622 podStartE2EDuration="3.227628796s" podCreationTimestamp="2026-04-22 14:17:38 +0000 UTC" firstStartedPulling="2026-04-22 14:17:38.658071368 +0000 UTC m=+143.455833967" lastFinishedPulling="2026-04-22 14:17:40.981898529 +0000 UTC m=+145.779661141" observedRunningTime="2026-04-22 14:17:41.226989628 +0000 UTC m=+146.024752248" watchObservedRunningTime="2026-04-22 14:17:41.227628796 +0000 UTC m=+146.025391416" Apr 22 14:17:41.802166 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:41.802133 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:41.802328 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:41.802306 2573 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:41.802399 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:41.802389 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls podName:84e5fe39-c177-4874-a08b-cec368549879 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:45.802370068 +0000 UTC m=+150.600132669 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fv6qj" (UID: "84e5fe39-c177-4874-a08b-cec368549879") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:41.903230 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:41.903191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:41.903407 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:41.903374 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:41.903471 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:41.903462 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls podName:070b7b4b-554f-4feb-b7a5-5e8185d64f96 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:45.903441566 +0000 UTC m=+150.701204177 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fjgkj" (UID: "070b7b4b-554f-4feb-b7a5-5e8185d64f96") : secret "samples-operator-tls" not found Apr 22 14:17:43.981492 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:43.981462 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b256j_04c378f7-681d-435b-8a89-47348177d100/dns-node-resolver/0.log" Apr 22 14:17:44.696159 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.696126 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-v7ql9"] Apr 22 14:17:44.700166 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.700151 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:44.714748 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.714730 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 14:17:44.714866 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.714765 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 14:17:44.716301 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.716281 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 14:17:44.717102 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.717084 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-6cxdf\"" Apr 22 14:17:44.717448 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.717426 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-v7ql9"] Apr 22 14:17:44.718030 
ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.718015 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 14:17:44.726468 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.726448 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9aa20c93-8349-4462-afe5-350655a5e6bb-signing-key\") pod \"service-ca-865cb79987-v7ql9\" (UID: \"9aa20c93-8349-4462-afe5-350655a5e6bb\") " pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:44.726553 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.726505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9aa20c93-8349-4462-afe5-350655a5e6bb-signing-cabundle\") pod \"service-ca-865cb79987-v7ql9\" (UID: \"9aa20c93-8349-4462-afe5-350655a5e6bb\") " pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:44.726553 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.726533 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n98wr\" (UniqueName: \"kubernetes.io/projected/9aa20c93-8349-4462-afe5-350655a5e6bb-kube-api-access-n98wr\") pod \"service-ca-865cb79987-v7ql9\" (UID: \"9aa20c93-8349-4462-afe5-350655a5e6bb\") " pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:44.827246 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.827210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9aa20c93-8349-4462-afe5-350655a5e6bb-signing-key\") pod \"service-ca-865cb79987-v7ql9\" (UID: \"9aa20c93-8349-4462-afe5-350655a5e6bb\") " pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:44.827434 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.827270 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9aa20c93-8349-4462-afe5-350655a5e6bb-signing-cabundle\") pod \"service-ca-865cb79987-v7ql9\" (UID: \"9aa20c93-8349-4462-afe5-350655a5e6bb\") " pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:44.827434 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.827286 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n98wr\" (UniqueName: \"kubernetes.io/projected/9aa20c93-8349-4462-afe5-350655a5e6bb-kube-api-access-n98wr\") pod \"service-ca-865cb79987-v7ql9\" (UID: \"9aa20c93-8349-4462-afe5-350655a5e6bb\") " pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:44.827978 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.827956 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9aa20c93-8349-4462-afe5-350655a5e6bb-signing-cabundle\") pod \"service-ca-865cb79987-v7ql9\" (UID: \"9aa20c93-8349-4462-afe5-350655a5e6bb\") " pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:44.829698 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.829663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9aa20c93-8349-4462-afe5-350655a5e6bb-signing-key\") pod \"service-ca-865cb79987-v7ql9\" (UID: \"9aa20c93-8349-4462-afe5-350655a5e6bb\") " pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:44.838971 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:44.838951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n98wr\" (UniqueName: \"kubernetes.io/projected/9aa20c93-8349-4462-afe5-350655a5e6bb-kube-api-access-n98wr\") pod \"service-ca-865cb79987-v7ql9\" (UID: \"9aa20c93-8349-4462-afe5-350655a5e6bb\") " pod="openshift-service-ca/service-ca-865cb79987-v7ql9" 
Apr 22 14:17:45.008601 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:45.008513 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-v7ql9" Apr 22 14:17:45.128704 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:45.128654 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-v7ql9"] Apr 22 14:17:45.132969 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:17:45.132938 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aa20c93_8349_4462_afe5_350655a5e6bb.slice/crio-aacb84adb7c51e160f597d1b34338c15d3d1aa31910f0030ad970260797d8ac3 WatchSource:0}: Error finding container aacb84adb7c51e160f597d1b34338c15d3d1aa31910f0030ad970260797d8ac3: Status 404 returned error can't find the container with id aacb84adb7c51e160f597d1b34338c15d3d1aa31910f0030ad970260797d8ac3 Apr 22 14:17:45.187485 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:45.187457 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-v7ql9" event={"ID":"9aa20c93-8349-4462-afe5-350655a5e6bb","Type":"ContainerStarted","Data":"aacb84adb7c51e160f597d1b34338c15d3d1aa31910f0030ad970260797d8ac3"} Apr 22 14:17:45.201566 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:45.201548 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vm6ts_b6c47887-6a82-4f97-8cf9-82dee1757b34/node-ca/0.log" Apr 22 14:17:45.834278 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:45.834250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:45.834445 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:45.834377 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:45.834445 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:45.834435 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls podName:84e5fe39-c177-4874-a08b-cec368549879 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:53.834419205 +0000 UTC m=+158.632181810 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fv6qj" (UID: "84e5fe39-c177-4874-a08b-cec368549879") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:45.935455 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:45.935423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:45.935632 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:45.935570 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:45.935707 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:45.935637 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls podName:070b7b4b-554f-4feb-b7a5-5e8185d64f96 
nodeName:}" failed. No retries permitted until 2026-04-22 14:17:53.935621345 +0000 UTC m=+158.733383949 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fjgkj" (UID: "070b7b4b-554f-4feb-b7a5-5e8185d64f96") : secret "samples-operator-tls" not found Apr 22 14:17:46.191494 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:46.191405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-v7ql9" event={"ID":"9aa20c93-8349-4462-afe5-350655a5e6bb","Type":"ContainerStarted","Data":"0a1841514120762ecb552d94e137d12630f3a7d5e9acbefd0a296f9de9eae3ab"} Apr 22 14:17:46.224219 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:46.224169 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-v7ql9" podStartSLOduration=2.224154612 podStartE2EDuration="2.224154612s" podCreationTimestamp="2026-04-22 14:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:46.223640931 +0000 UTC m=+151.021403552" watchObservedRunningTime="2026-04-22 14:17:46.224154612 +0000 UTC m=+151.021917234" Apr 22 14:17:50.651699 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:50.651656 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" podUID="5e58ca55-5638-4845-9cb8-959ad4a0d61f" Apr 22 14:17:50.663854 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:50.663829 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-dns/dns-default-b7rqq" podUID="cab656ec-aafc-44d5-bcf4-998f7334f612" Apr 22 14:17:50.695119 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:50.695096 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kmxlt" podUID="6425daea-30dc-424d-a7ff-d63860240eee" Apr 22 14:17:50.784956 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:50.784923 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qh8tk" podUID="6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d" Apr 22 14:17:51.202351 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:51.202321 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:17:51.202523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:51.202321 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-b7rqq" Apr 22 14:17:53.898826 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:53.898786 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:17:53.899189 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:53.898945 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:53.899189 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:17:53.899021 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls podName:84e5fe39-c177-4874-a08b-cec368549879 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:09.899005979 +0000 UTC m=+174.696768595 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fv6qj" (UID: "84e5fe39-c177-4874-a08b-cec368549879") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:53.999541 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:53.999501 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:54.001877 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:54.001860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/070b7b4b-554f-4feb-b7a5-5e8185d64f96-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fjgkj\" (UID: \"070b7b4b-554f-4feb-b7a5-5e8185d64f96\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:54.029759 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:54.029736 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" Apr 22 14:17:54.145913 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:54.145881 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj"] Apr 22 14:17:54.212491 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:54.212457 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" event={"ID":"070b7b4b-554f-4feb-b7a5-5e8185d64f96","Type":"ContainerStarted","Data":"f425828cc535a06b5c2b936db40df1531ec178e71811a7b84e60be07154376a1"} Apr 22 14:17:55.613068 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.613028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:17:55.613068 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.613074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:17:55.613650 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.613183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:17:55.615914 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.615890 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cab656ec-aafc-44d5-bcf4-998f7334f612-metrics-tls\") pod \"dns-default-b7rqq\" (UID: \"cab656ec-aafc-44d5-bcf4-998f7334f612\") " pod="openshift-dns/dns-default-b7rqq" Apr 22 14:17:55.616156 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.616133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6425daea-30dc-424d-a7ff-d63860240eee-cert\") pod \"ingress-canary-kmxlt\" (UID: \"6425daea-30dc-424d-a7ff-d63860240eee\") " pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:17:55.616245 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.616224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"image-registry-5dbc4fdcb4-5wb6v\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:17:55.706227 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.706190 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpmwg\"" Apr 22 14:17:55.706412 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.706190 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-x72w4\"" Apr 22 14:17:55.713778 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.713752 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:17:55.713919 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:55.713810 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-b7rqq" Apr 22 14:17:56.028266 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:56.028237 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b7rqq"] Apr 22 14:17:56.031616 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:17:56.031588 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcab656ec_aafc_44d5_bcf4_998f7334f612.slice/crio-5010f15e2c0082b5160979470d93889f33686c91c02086226748984f7c7ffa77 WatchSource:0}: Error finding container 5010f15e2c0082b5160979470d93889f33686c91c02086226748984f7c7ffa77: Status 404 returned error can't find the container with id 5010f15e2c0082b5160979470d93889f33686c91c02086226748984f7c7ffa77 Apr 22 14:17:56.048899 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:56.048872 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"] Apr 22 14:17:56.051127 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:17:56.051099 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e58ca55_5638_4845_9cb8_959ad4a0d61f.slice/crio-096c6a003e92ea2264494d58cf3690cf06733d8b3cc304b4438206d04339cf75 WatchSource:0}: Error finding container 096c6a003e92ea2264494d58cf3690cf06733d8b3cc304b4438206d04339cf75: Status 404 returned error can't find the container with id 096c6a003e92ea2264494d58cf3690cf06733d8b3cc304b4438206d04339cf75 Apr 22 14:17:56.219961 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:56.219870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" event={"ID":"070b7b4b-554f-4feb-b7a5-5e8185d64f96","Type":"ContainerStarted","Data":"c723a65c235df0f6b9462704cdf0c2674732c2e420e0b467080caa4d97093f1e"} Apr 22 14:17:56.219961 ip-10-0-136-45 kubenswrapper[2573]: I0422 
14:17:56.219912 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" event={"ID":"070b7b4b-554f-4feb-b7a5-5e8185d64f96","Type":"ContainerStarted","Data":"10e2e64c51138e92e82b592ca14b723e1d5de73f6143aea9aa7cbe1c0939e6b9"} Apr 22 14:17:56.221054 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:56.221026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b7rqq" event={"ID":"cab656ec-aafc-44d5-bcf4-998f7334f612","Type":"ContainerStarted","Data":"5010f15e2c0082b5160979470d93889f33686c91c02086226748984f7c7ffa77"} Apr 22 14:17:56.222455 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:56.222428 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" event={"ID":"5e58ca55-5638-4845-9cb8-959ad4a0d61f","Type":"ContainerStarted","Data":"c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b"} Apr 22 14:17:56.222455 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:56.222461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" event={"ID":"5e58ca55-5638-4845-9cb8-959ad4a0d61f","Type":"ContainerStarted","Data":"096c6a003e92ea2264494d58cf3690cf06733d8b3cc304b4438206d04339cf75"} Apr 22 14:17:56.222669 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:56.222565 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:17:56.238705 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:56.238653 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fjgkj" podStartSLOduration=16.461630637 podStartE2EDuration="18.238641064s" podCreationTimestamp="2026-04-22 14:17:38 +0000 UTC" firstStartedPulling="2026-04-22 14:17:54.188457035 +0000 UTC m=+158.986219634" 
lastFinishedPulling="2026-04-22 14:17:55.965467447 +0000 UTC m=+160.763230061" observedRunningTime="2026-04-22 14:17:56.237646284 +0000 UTC m=+161.035408905" watchObservedRunningTime="2026-04-22 14:17:56.238641064 +0000 UTC m=+161.036403685" Apr 22 14:17:56.255397 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:56.255358 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" podStartSLOduration=160.255346965 podStartE2EDuration="2m40.255346965s" podCreationTimestamp="2026-04-22 14:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:56.254967216 +0000 UTC m=+161.052729837" watchObservedRunningTime="2026-04-22 14:17:56.255346965 +0000 UTC m=+161.053109586" Apr 22 14:17:58.230670 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:58.230634 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b7rqq" event={"ID":"cab656ec-aafc-44d5-bcf4-998f7334f612","Type":"ContainerStarted","Data":"03a2643f63ba4e896aac96bfc77c4dd2cb580af73f204fd700347934f0e92cb8"} Apr 22 14:17:58.230670 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:58.230671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b7rqq" event={"ID":"cab656ec-aafc-44d5-bcf4-998f7334f612","Type":"ContainerStarted","Data":"75023e1bf9a686d16f591f2e0fc8ac6c142269cc0f0298803fe2d91fcb86796f"} Apr 22 14:17:58.231086 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:58.230798 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-b7rqq" Apr 22 14:17:58.260655 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:17:58.260607 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-b7rqq" podStartSLOduration=129.932807229 podStartE2EDuration="2m11.26059285s" podCreationTimestamp="2026-04-22 
14:15:47 +0000 UTC" firstStartedPulling="2026-04-22 14:17:56.033720774 +0000 UTC m=+160.831483387" lastFinishedPulling="2026-04-22 14:17:57.361506405 +0000 UTC m=+162.159269008" observedRunningTime="2026-04-22 14:17:58.25955917 +0000 UTC m=+163.057321790" watchObservedRunningTime="2026-04-22 14:17:58.26059285 +0000 UTC m=+163.058355471" Apr 22 14:18:04.773559 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:04.773470 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:18:04.775826 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:04.775738 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lthk8\"" Apr 22 14:18:04.784542 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:04.784526 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kmxlt" Apr 22 14:18:04.895369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:04.895343 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kmxlt"] Apr 22 14:18:04.898393 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:04.898351 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6425daea_30dc_424d_a7ff_d63860240eee.slice/crio-45a6b042f195144f7d11614ce69a81ae2836b22b2b3285f32069e13a18bb2b6f WatchSource:0}: Error finding container 45a6b042f195144f7d11614ce69a81ae2836b22b2b3285f32069e13a18bb2b6f: Status 404 returned error can't find the container with id 45a6b042f195144f7d11614ce69a81ae2836b22b2b3285f32069e13a18bb2b6f Apr 22 14:18:05.251207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:05.251131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kmxlt" 
event={"ID":"6425daea-30dc-424d-a7ff-d63860240eee","Type":"ContainerStarted","Data":"45a6b042f195144f7d11614ce69a81ae2836b22b2b3285f32069e13a18bb2b6f"} Apr 22 14:18:05.775879 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:05.775853 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk" Apr 22 14:18:07.257149 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.257116 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kmxlt" event={"ID":"6425daea-30dc-424d-a7ff-d63860240eee","Type":"ContainerStarted","Data":"0ce50f265fbf6965a0fec0106cbd2b56248034577480f18988b87413a872f977"} Apr 22 14:18:07.291656 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.291629 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c9nv9"] Apr 22 14:18:07.291908 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.291839 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kmxlt" podStartSLOduration=138.940476055 podStartE2EDuration="2m20.291826884s" podCreationTimestamp="2026-04-22 14:15:47 +0000 UTC" firstStartedPulling="2026-04-22 14:18:04.900008216 +0000 UTC m=+169.697770816" lastFinishedPulling="2026-04-22 14:18:06.251359035 +0000 UTC m=+171.049121645" observedRunningTime="2026-04-22 14:18:07.290074488 +0000 UTC m=+172.087837106" watchObservedRunningTime="2026-04-22 14:18:07.291826884 +0000 UTC m=+172.089589507" Apr 22 14:18:07.295368 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.295351 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.304203 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.304170 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 14:18:07.304299 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.304218 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 14:18:07.304502 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.304486 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zwk4n\"" Apr 22 14:18:07.308034 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.308014 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-x676c"] Apr 22 14:18:07.310852 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.310836 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-x676c" Apr 22 14:18:07.313273 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:07.313242 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-136-45.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'ip-10-0-136-45.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 22 14:18:07.313372 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:07.313242 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"default-dockercfg-lddkq\" is forbidden: User \"system:node:ip-10-0-136-45.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'ip-10-0-136-45.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console\"/\"default-dockercfg-lddkq\"" type="*v1.Secret" Apr 22 14:18:07.316930 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.316908 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c9nv9"] Apr 22 14:18:07.317940 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.317921 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 14:18:07.346209 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.346184 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-x676c"] Apr 22 14:18:07.400652 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.400585 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmkf\" (UniqueName: 
\"kubernetes.io/projected/0d4a5b6f-e316-4ade-9baf-933024dc955e-kube-api-access-4zmkf\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.400652 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.400633 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0d4a5b6f-e316-4ade-9baf-933024dc955e-crio-socket\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.400870 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.400709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdx4\" (UniqueName: \"kubernetes.io/projected/cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c-kube-api-access-9bdx4\") pod \"downloads-6bcc868b7-x676c\" (UID: \"cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c\") " pod="openshift-console/downloads-6bcc868b7-x676c" Apr 22 14:18:07.400870 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.400815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0d4a5b6f-e316-4ade-9baf-933024dc955e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.400870 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.400852 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0d4a5b6f-e316-4ade-9baf-933024dc955e-data-volume\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " 
pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.400988 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.400871 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0d4a5b6f-e316-4ade-9baf-933024dc955e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.501543 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.501509 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0d4a5b6f-e316-4ade-9baf-933024dc955e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.501676 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.501560 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0d4a5b6f-e316-4ade-9baf-933024dc955e-data-volume\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.501676 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.501583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0d4a5b6f-e316-4ade-9baf-933024dc955e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.501676 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.501618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4zmkf\" (UniqueName: \"kubernetes.io/projected/0d4a5b6f-e316-4ade-9baf-933024dc955e-kube-api-access-4zmkf\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.501676 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.501645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0d4a5b6f-e316-4ade-9baf-933024dc955e-crio-socket\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.501902 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.501711 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdx4\" (UniqueName: \"kubernetes.io/projected/cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c-kube-api-access-9bdx4\") pod \"downloads-6bcc868b7-x676c\" (UID: \"cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c\") " pod="openshift-console/downloads-6bcc868b7-x676c" Apr 22 14:18:07.501902 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.501774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0d4a5b6f-e316-4ade-9baf-933024dc955e-crio-socket\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.501902 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.501883 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0d4a5b6f-e316-4ade-9baf-933024dc955e-data-volume\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.502086 ip-10-0-136-45 kubenswrapper[2573]: 
I0422 14:18:07.502067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0d4a5b6f-e316-4ade-9baf-933024dc955e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.503868 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.503850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0d4a5b6f-e316-4ade-9baf-933024dc955e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.511591 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.511540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmkf\" (UniqueName: \"kubernetes.io/projected/0d4a5b6f-e316-4ade-9baf-933024dc955e-kube-api-access-4zmkf\") pod \"insights-runtime-extractor-c9nv9\" (UID: \"0d4a5b6f-e316-4ade-9baf-933024dc955e\") " pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.603475 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.603456 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c9nv9" Apr 22 14:18:07.722025 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:07.721998 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c9nv9"] Apr 22 14:18:07.725058 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:07.725027 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d4a5b6f_e316_4ade_9baf_933024dc955e.slice/crio-6f0feaf45fb540d44d3d621125b04ebd07af242f1eae6a5974470035826be38e WatchSource:0}: Error finding container 6f0feaf45fb540d44d3d621125b04ebd07af242f1eae6a5974470035826be38e: Status 404 returned error can't find the container with id 6f0feaf45fb540d44d3d621125b04ebd07af242f1eae6a5974470035826be38e Apr 22 14:18:08.199493 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.199462 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5768447b96-9pmff"] Apr 22 14:18:08.202499 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.202483 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.204947 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.204921 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 14:18:08.204947 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.204940 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xv94w\"" Apr 22 14:18:08.205632 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.205613 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 14:18:08.205759 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.205640 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 14:18:08.205759 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.205653 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 14:18:08.205759 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.205613 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 14:18:08.214235 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.214215 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5768447b96-9pmff"] Apr 22 14:18:08.237167 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.237144 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-b7rqq" Apr 22 14:18:08.261032 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.260998 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9nv9" 
event={"ID":"0d4a5b6f-e316-4ade-9baf-933024dc955e","Type":"ContainerStarted","Data":"40757ef92819413fb679e83bc1f73d8bf8e79b05a4a2b68a1985e35dd12ecaa6"} Apr 22 14:18:08.261387 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.261036 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9nv9" event={"ID":"0d4a5b6f-e316-4ade-9baf-933024dc955e","Type":"ContainerStarted","Data":"6f0feaf45fb540d44d3d621125b04ebd07af242f1eae6a5974470035826be38e"} Apr 22 14:18:08.307346 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.307315 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-oauth-config\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.307473 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.307404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-serving-cert\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.307473 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.307451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmfj\" (UniqueName: \"kubernetes.io/projected/57870da4-8b9b-45d3-a028-0ec65ce502ca-kube-api-access-fvmfj\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.307566 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.307477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-service-ca\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.307613 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.307590 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-config\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.307714 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.307633 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-oauth-serving-cert\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.332965 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.332941 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-lddkq\"" Apr 22 14:18:08.408305 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.408275 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-oauth-config\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.408430 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.408330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-serving-cert\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.408430 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.408360 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmfj\" (UniqueName: \"kubernetes.io/projected/57870da4-8b9b-45d3-a028-0ec65ce502ca-kube-api-access-fvmfj\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.408430 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.408385 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-service-ca\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.408532 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.408438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-config\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.408532 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.408470 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-oauth-serving-cert\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.409193 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.409167 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-service-ca\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.409326 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.409173 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-config\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.409326 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.409211 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-oauth-serving-cert\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.410572 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.410551 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-serving-cert\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.410798 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.410783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-oauth-config\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.509846 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:08.509818 2573 projected.go:289] Couldn't get 
configMap openshift-console/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 22 14:18:08.509949 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:08.509860 2573 projected.go:194] Error preparing data for projected volume kube-api-access-9bdx4 for pod openshift-console/downloads-6bcc868b7-x676c: failed to sync configmap cache: timed out waiting for the condition Apr 22 14:18:08.509949 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:08.509916 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c-kube-api-access-9bdx4 podName:cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c nodeName:}" failed. No retries permitted until 2026-04-22 14:18:09.009897478 +0000 UTC m=+173.807660091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9bdx4" (UniqueName: "kubernetes.io/projected/cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c-kube-api-access-9bdx4") pod "downloads-6bcc868b7-x676c" (UID: "cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c") : failed to sync configmap cache: timed out waiting for the condition Apr 22 14:18:08.555520 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.555491 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 14:18:08.563794 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.563769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmfj\" (UniqueName: \"kubernetes.io/projected/57870da4-8b9b-45d3-a028-0ec65ce502ca-kube-api-access-fvmfj\") pod \"console-5768447b96-9pmff\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.811456 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.811427 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:08.938381 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:08.938351 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5768447b96-9pmff"] Apr 22 14:18:08.941361 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:08.941334 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57870da4_8b9b_45d3_a028_0ec65ce502ca.slice/crio-01dbf7f11a018f4711fdc5586241a010c97c332cd7d0eb1e3d55adee1c0fabbc WatchSource:0}: Error finding container 01dbf7f11a018f4711fdc5586241a010c97c332cd7d0eb1e3d55adee1c0fabbc: Status 404 returned error can't find the container with id 01dbf7f11a018f4711fdc5586241a010c97c332cd7d0eb1e3d55adee1c0fabbc Apr 22 14:18:09.014265 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:09.014226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdx4\" (UniqueName: \"kubernetes.io/projected/cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c-kube-api-access-9bdx4\") pod \"downloads-6bcc868b7-x676c\" (UID: \"cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c\") " pod="openshift-console/downloads-6bcc868b7-x676c" Apr 22 14:18:09.016994 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:09.016974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bdx4\" (UniqueName: \"kubernetes.io/projected/cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c-kube-api-access-9bdx4\") pod \"downloads-6bcc868b7-x676c\" (UID: \"cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c\") " pod="openshift-console/downloads-6bcc868b7-x676c" Apr 22 14:18:09.120410 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:09.120339 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-x676c" Apr 22 14:18:09.247809 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:09.247780 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-x676c"] Apr 22 14:18:09.250769 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:09.250743 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf11cd5f_5f43_4b8f_9de3_5d91eabe9b5c.slice/crio-7d16c6f5c9ac0581f95154398ff47d1eb0e2bc322781c5232ba0f427de63a276 WatchSource:0}: Error finding container 7d16c6f5c9ac0581f95154398ff47d1eb0e2bc322781c5232ba0f427de63a276: Status 404 returned error can't find the container with id 7d16c6f5c9ac0581f95154398ff47d1eb0e2bc322781c5232ba0f427de63a276 Apr 22 14:18:09.267206 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:09.267170 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5768447b96-9pmff" event={"ID":"57870da4-8b9b-45d3-a028-0ec65ce502ca","Type":"ContainerStarted","Data":"01dbf7f11a018f4711fdc5586241a010c97c332cd7d0eb1e3d55adee1c0fabbc"} Apr 22 14:18:09.268908 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:09.268884 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9nv9" event={"ID":"0d4a5b6f-e316-4ade-9baf-933024dc955e","Type":"ContainerStarted","Data":"38f9e947652cdff23c1002bfe8373e233aa4af17f65110fa5565bcf591c78543"} Apr 22 14:18:09.269894 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:09.269868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-x676c" event={"ID":"cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c","Type":"ContainerStarted","Data":"7d16c6f5c9ac0581f95154398ff47d1eb0e2bc322781c5232ba0f427de63a276"} Apr 22 14:18:09.923107 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:09.923062 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:18:09.925314 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:09.925279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84e5fe39-c177-4874-a08b-cec368549879-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fv6qj\" (UID: \"84e5fe39-c177-4874-a08b-cec368549879\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:18:10.114441 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:10.114352 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" Apr 22 14:18:10.244805 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:10.244751 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj"] Apr 22 14:18:10.249402 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:10.249361 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84e5fe39_c177_4874_a08b_cec368549879.slice/crio-f05728b268b26da5d870d656cfba0b7ba6da060e66db93eb899003010bfa255b WatchSource:0}: Error finding container f05728b268b26da5d870d656cfba0b7ba6da060e66db93eb899003010bfa255b: Status 404 returned error can't find the container with id f05728b268b26da5d870d656cfba0b7ba6da060e66db93eb899003010bfa255b Apr 22 14:18:10.281734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:10.281681 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9nv9" 
event={"ID":"0d4a5b6f-e316-4ade-9baf-933024dc955e","Type":"ContainerStarted","Data":"2df1880c8c68ddac6f7aba33eff0bf3313a0a9de0f58b24cb45467dfc28c4c8e"} Apr 22 14:18:10.283383 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:10.283357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" event={"ID":"84e5fe39-c177-4874-a08b-cec368549879","Type":"ContainerStarted","Data":"f05728b268b26da5d870d656cfba0b7ba6da060e66db93eb899003010bfa255b"} Apr 22 14:18:10.300570 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:10.300516 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c9nv9" podStartSLOduration=1.273967382 podStartE2EDuration="3.300497384s" podCreationTimestamp="2026-04-22 14:18:07 +0000 UTC" firstStartedPulling="2026-04-22 14:18:07.780503685 +0000 UTC m=+172.578266285" lastFinishedPulling="2026-04-22 14:18:09.807033683 +0000 UTC m=+174.604796287" observedRunningTime="2026-04-22 14:18:10.299984395 +0000 UTC m=+175.097747016" watchObservedRunningTime="2026-04-22 14:18:10.300497384 +0000 UTC m=+175.098260007" Apr 22 14:18:14.018205 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.018173 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km"] Apr 22 14:18:14.021489 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.021467 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" Apr 22 14:18:14.024602 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.024573 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 14:18:14.024743 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.024618 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-kqs2f\"" Apr 22 14:18:14.036947 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.036916 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km"] Apr 22 14:18:14.159505 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.159451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/968a0cf8-f5d3-4918-9501-d37c536bbccf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2l4km\" (UID: \"968a0cf8-f5d3-4918-9501-d37c536bbccf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" Apr 22 14:18:14.260707 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.260655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/968a0cf8-f5d3-4918-9501-d37c536bbccf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2l4km\" (UID: \"968a0cf8-f5d3-4918-9501-d37c536bbccf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" Apr 22 14:18:14.260859 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:14.260821 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" 
not found Apr 22 14:18:14.260904 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:14.260889 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/968a0cf8-f5d3-4918-9501-d37c536bbccf-tls-certificates podName:968a0cf8-f5d3-4918-9501-d37c536bbccf nodeName:}" failed. No retries permitted until 2026-04-22 14:18:14.760871735 +0000 UTC m=+179.558634345 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/968a0cf8-f5d3-4918-9501-d37c536bbccf-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-2l4km" (UID: "968a0cf8-f5d3-4918-9501-d37c536bbccf") : secret "prometheus-operator-admission-webhook-tls" not found Apr 22 14:18:14.297616 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.297532 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5768447b96-9pmff" event={"ID":"57870da4-8b9b-45d3-a028-0ec65ce502ca","Type":"ContainerStarted","Data":"38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71"} Apr 22 14:18:14.298991 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.298964 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" event={"ID":"84e5fe39-c177-4874-a08b-cec368549879","Type":"ContainerStarted","Data":"92d8491c67cd22b70c133a549e348541901aa63a83d3c790e2131b00582d00fc"} Apr 22 14:18:14.316326 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.316284 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5768447b96-9pmff" podStartSLOduration=1.854479652 podStartE2EDuration="6.316271134s" podCreationTimestamp="2026-04-22 14:18:08 +0000 UTC" firstStartedPulling="2026-04-22 14:18:08.94321549 +0000 UTC m=+173.740978089" lastFinishedPulling="2026-04-22 14:18:13.405006967 +0000 UTC m=+178.202769571" observedRunningTime="2026-04-22 14:18:14.315335098 +0000 UTC m=+179.113097743" 
watchObservedRunningTime="2026-04-22 14:18:14.316271134 +0000 UTC m=+179.114033756" Apr 22 14:18:14.333534 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.333489 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fv6qj" podStartSLOduration=34.182098497 podStartE2EDuration="37.3334759s" podCreationTimestamp="2026-04-22 14:17:37 +0000 UTC" firstStartedPulling="2026-04-22 14:18:10.251769366 +0000 UTC m=+175.049531976" lastFinishedPulling="2026-04-22 14:18:13.403146777 +0000 UTC m=+178.200909379" observedRunningTime="2026-04-22 14:18:14.333238202 +0000 UTC m=+179.131000865" watchObservedRunningTime="2026-04-22 14:18:14.3334759 +0000 UTC m=+179.131238522" Apr 22 14:18:14.766222 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.766180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/968a0cf8-f5d3-4918-9501-d37c536bbccf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2l4km\" (UID: \"968a0cf8-f5d3-4918-9501-d37c536bbccf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" Apr 22 14:18:14.775210 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.775183 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/968a0cf8-f5d3-4918-9501-d37c536bbccf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2l4km\" (UID: \"968a0cf8-f5d3-4918-9501-d37c536bbccf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" Apr 22 14:18:14.933306 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:14.933255 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" Apr 22 14:18:15.072824 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:15.072788 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km"] Apr 22 14:18:15.075892 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:15.075866 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod968a0cf8_f5d3_4918_9501_d37c536bbccf.slice/crio-a9131f26c3ab735eb8b4e806373a7c7635d3b3ae3454e26deaef11f2a6ca7e51 WatchSource:0}: Error finding container a9131f26c3ab735eb8b4e806373a7c7635d3b3ae3454e26deaef11f2a6ca7e51: Status 404 returned error can't find the container with id a9131f26c3ab735eb8b4e806373a7c7635d3b3ae3454e26deaef11f2a6ca7e51 Apr 22 14:18:15.302546 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:15.302461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" event={"ID":"968a0cf8-f5d3-4918-9501-d37c536bbccf","Type":"ContainerStarted","Data":"a9131f26c3ab735eb8b4e806373a7c7635d3b3ae3454e26deaef11f2a6ca7e51"} Apr 22 14:18:15.719668 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:15.719584 2573 patch_prober.go:28] interesting pod/image-registry-5dbc4fdcb4-5wb6v container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 14:18:15.719965 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:15.719645 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" podUID="5e58ca55-5638-4845-9cb8-959ad4a0d61f" containerName="registry" probeResult="failure" output="HTTP probe failed 
with statuscode: 503" Apr 22 14:18:17.231312 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:17.231278 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:18:17.313465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:17.313428 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" event={"ID":"968a0cf8-f5d3-4918-9501-d37c536bbccf","Type":"ContainerStarted","Data":"0abc7568f57485b85bf869df45d21ed7d70bc48816ba53bb9ed80e172e82c138"} Apr 22 14:18:17.313815 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:17.313777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" Apr 22 14:18:17.319069 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:17.319047 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" Apr 22 14:18:17.330410 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:17.330357 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2l4km" podStartSLOduration=3.164415196 podStartE2EDuration="4.330344075s" podCreationTimestamp="2026-04-22 14:18:13 +0000 UTC" firstStartedPulling="2026-04-22 14:18:15.078335289 +0000 UTC m=+179.876097892" lastFinishedPulling="2026-04-22 14:18:16.244264159 +0000 UTC m=+181.042026771" observedRunningTime="2026-04-22 14:18:17.328915749 +0000 UTC m=+182.126678373" watchObservedRunningTime="2026-04-22 14:18:17.330344075 +0000 UTC m=+182.128106697" Apr 22 14:18:18.067657 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.067618 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vc7fj"] Apr 22 14:18:18.071274 
ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.071251 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.074408 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.074382 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-z2pr7\"" Apr 22 14:18:18.074517 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.074428 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 14:18:18.074654 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.074640 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 14:18:18.074765 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.074651 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 14:18:18.086183 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.086160 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vc7fj"] Apr 22 14:18:18.198453 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.198416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37142c62-18c9-4bbe-b554-97c41a82c03e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.198619 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.198502 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbjk\" 
(UniqueName: \"kubernetes.io/projected/37142c62-18c9-4bbe-b554-97c41a82c03e-kube-api-access-8vbjk\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.198619 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.198532 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/37142c62-18c9-4bbe-b554-97c41a82c03e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.198619 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.198561 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37142c62-18c9-4bbe-b554-97c41a82c03e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.299807 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.299770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37142c62-18c9-4bbe-b554-97c41a82c03e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.300317 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.299876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbjk\" (UniqueName: \"kubernetes.io/projected/37142c62-18c9-4bbe-b554-97c41a82c03e-kube-api-access-8vbjk\") pod 
\"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.300317 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.299908 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/37142c62-18c9-4bbe-b554-97c41a82c03e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.300317 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.299945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37142c62-18c9-4bbe-b554-97c41a82c03e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.300317 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:18.300245 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 14:18:18.300569 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:18.300323 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37142c62-18c9-4bbe-b554-97c41a82c03e-prometheus-operator-tls podName:37142c62-18c9-4bbe-b554-97c41a82c03e nodeName:}" failed. No retries permitted until 2026-04-22 14:18:18.800303101 +0000 UTC m=+183.598065720 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/37142c62-18c9-4bbe-b554-97c41a82c03e-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-vc7fj" (UID: "37142c62-18c9-4bbe-b554-97c41a82c03e") : secret "prometheus-operator-tls" not found Apr 22 14:18:18.300759 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.300734 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37142c62-18c9-4bbe-b554-97c41a82c03e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.302605 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.302581 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37142c62-18c9-4bbe-b554-97c41a82c03e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.312183 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.312134 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbjk\" (UniqueName: \"kubernetes.io/projected/37142c62-18c9-4bbe-b554-97c41a82c03e-kube-api-access-8vbjk\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.803071 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.803025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/37142c62-18c9-4bbe-b554-97c41a82c03e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: 
\"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.805414 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.805388 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/37142c62-18c9-4bbe-b554-97c41a82c03e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vc7fj\" (UID: \"37142c62-18c9-4bbe-b554-97c41a82c03e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:18.812158 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.812137 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:18.812407 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.812386 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:18.818101 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.818083 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:18.982284 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:18.982255 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" Apr 22 14:18:19.323887 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:19.323856 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:25.575437 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:25.575410 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vc7fj"] Apr 22 14:18:25.578260 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:25.578226 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37142c62_18c9_4bbe_b554_97c41a82c03e.slice/crio-359d0a8b00542505ea152c43c590fa8726b0eff1df6c6a247983759edb99e973 WatchSource:0}: Error finding container 359d0a8b00542505ea152c43c590fa8726b0eff1df6c6a247983759edb99e973: Status 404 returned error can't find the container with id 359d0a8b00542505ea152c43c590fa8726b0eff1df6c6a247983759edb99e973 Apr 22 14:18:26.341421 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:26.341340 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-x676c" event={"ID":"cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c","Type":"ContainerStarted","Data":"596d9409f0f4687d1f1e92bdedd4e7191a173693f4ca93667dc15bc1c1c6cc85"} Apr 22 14:18:26.344074 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:26.344050 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-x676c" Apr 22 14:18:26.345338 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:26.345296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" event={"ID":"37142c62-18c9-4bbe-b554-97c41a82c03e","Type":"ContainerStarted","Data":"359d0a8b00542505ea152c43c590fa8726b0eff1df6c6a247983759edb99e973"} Apr 22 14:18:26.360042 ip-10-0-136-45 
kubenswrapper[2573]: I0422 14:18:26.359992 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-x676c" podStartSLOduration=3.074199697 podStartE2EDuration="19.359976692s" podCreationTimestamp="2026-04-22 14:18:07 +0000 UTC" firstStartedPulling="2026-04-22 14:18:09.252881104 +0000 UTC m=+174.050643708" lastFinishedPulling="2026-04-22 14:18:25.538658095 +0000 UTC m=+190.336420703" observedRunningTime="2026-04-22 14:18:26.359956215 +0000 UTC m=+191.157718830" watchObservedRunningTime="2026-04-22 14:18:26.359976692 +0000 UTC m=+191.157739314" Apr 22 14:18:26.361145 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:26.361123 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-x676c" Apr 22 14:18:27.350284 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:27.350241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" event={"ID":"37142c62-18c9-4bbe-b554-97c41a82c03e","Type":"ContainerStarted","Data":"e4e43b253dcd2d2bb6a4e8e5562820d22be342252c3c30e5ced4d36b83a4736e"} Apr 22 14:18:28.354814 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:28.354764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" event={"ID":"37142c62-18c9-4bbe-b554-97c41a82c03e","Type":"ContainerStarted","Data":"aa7ff51fc94bc6256bdee3538a6c5b103c2d55bafe40d4942cbda9df0a4ac3a9"} Apr 22 14:18:28.381119 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:28.381072 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc7fj" podStartSLOduration=8.795729271999999 podStartE2EDuration="10.38105735s" podCreationTimestamp="2026-04-22 14:18:18 +0000 UTC" firstStartedPulling="2026-04-22 14:18:25.580184523 +0000 UTC m=+190.377947123" lastFinishedPulling="2026-04-22 14:18:27.16551259 +0000 UTC 
m=+191.963275201" observedRunningTime="2026-04-22 14:18:28.38021946 +0000 UTC m=+193.177982104" watchObservedRunningTime="2026-04-22 14:18:28.38105735 +0000 UTC m=+193.178819973" Apr 22 14:18:29.543478 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:29.543447 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5768447b96-9pmff"] Apr 22 14:18:29.830256 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:29.830182 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"] Apr 22 14:18:30.512394 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.512346 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-v7xr8"] Apr 22 14:18:30.590245 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.590207 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-v7xr8"] Apr 22 14:18:30.590718 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.590390 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.590809 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.590702 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kngm6"] Apr 22 14:18:30.594827 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.594307 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 14:18:30.594827 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.594434 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 14:18:30.594827 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.594523 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 14:18:30.594827 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.594725 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ctbsg\"" Apr 22 14:18:30.622169 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.622146 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.624541 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.624517 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 14:18:30.625050 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.624738 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zgcjt\"" Apr 22 14:18:30.625050 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.624918 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 14:18:30.625207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.625145 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 14:18:30.715459 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.715422 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.715629 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.715491 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.715629 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.715584 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82bsk\" (UniqueName: \"kubernetes.io/projected/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-api-access-82bsk\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.715760 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.715619 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.715760 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.715742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.715854 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.715784 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.816619 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.816587 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.816785 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.816629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-sys\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.816785 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.816666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.816785 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.816699 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-textfile\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.816785 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.816729 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-wtmp\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " 
pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.817010 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.816878 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.817010 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.816926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82bsk\" (UniqueName: \"kubernetes.io/projected/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-api-access-82bsk\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.817010 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.816954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-metrics-client-ca\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.817010 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.816977 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.817207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.817013 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.817207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.817029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-root\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.817207 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:30.817032 2573 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 14:18:30.817207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.817045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-accelerators-collector-config\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.817207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.817063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccf4\" (UniqueName: \"kubernetes.io/projected/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-kube-api-access-5ccf4\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.817207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.817084 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.817207 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:30.817117 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-tls podName:c820e8c6-75f5-4e79-9f8c-04e662cce3e8 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:31.317096399 +0000 UTC m=+196.114859017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-v7xr8" (UID: "c820e8c6-75f5-4e79-9f8c-04e662cce3e8") : secret "kube-state-metrics-tls" not found Apr 22 14:18:30.817207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.817155 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-tls\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.818301 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.818278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.818613 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.818591 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.819236 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.819220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.819890 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.819868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.828100 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.828071 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82bsk\" (UniqueName: \"kubernetes.io/projected/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-api-access-82bsk\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" Apr 22 14:18:30.917788 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.917760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-metrics-client-ca\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918004 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.917984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-root\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918099 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-accelerators-collector-config\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918159 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-root\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918159 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccf4\" (UniqueName: \"kubernetes.io/projected/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-kube-api-access-5ccf4\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918258 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-tls\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918258 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918258 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-sys\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918402 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918294 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-textfile\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918402 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918323 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-wtmp\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6" Apr 22 14:18:30.918508 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918474 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-wtmp\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6"
Apr 22 14:18:30.918508 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918487 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-metrics-client-ca\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6"
Apr 22 14:18:30.918603 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918529 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-accelerators-collector-config\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6"
Apr 22 14:18:30.918807 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.918787 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-sys\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6"
Apr 22 14:18:30.919057 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.919018 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-textfile\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6"
Apr 22 14:18:30.921877 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.921850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6"
Apr 22 14:18:30.921988 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.921853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-node-exporter-tls\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6"
Apr 22 14:18:30.927557 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.927535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccf4\" (UniqueName: \"kubernetes.io/projected/cfe6ba1b-0e9f-4a07-ae97-e903f41c1194-kube-api-access-5ccf4\") pod \"node-exporter-kngm6\" (UID: \"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194\") " pod="openshift-monitoring/node-exporter-kngm6"
Apr 22 14:18:30.933882 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:30.933862 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kngm6"
Apr 22 14:18:31.322633 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.322601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8"
Apr 22 14:18:31.325332 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.325288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c820e8c6-75f5-4e79-9f8c-04e662cce3e8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v7xr8\" (UID: \"c820e8c6-75f5-4e79-9f8c-04e662cce3e8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8"
Apr 22 14:18:31.366071 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.366036 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kngm6" event={"ID":"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194","Type":"ContainerStarted","Data":"4f3023960a21e94dfa7b9b83afe87e78cfc26cc2985a4e8d3ce2e02f34e6ef36"}
Apr 22 14:18:31.503126 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.503071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8"
Apr 22 14:18:31.559442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.559410 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:18:31.580981 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.579564 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.580981 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.580595 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:18:31.588208 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.588183 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 14:18:31.588511 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.588460 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-pcdtd\""
Apr 22 14:18:31.588746 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.588727 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 14:18:31.594332 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.593491 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 14:18:31.594332 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.593733 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 14:18:31.594332 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.593771 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 14:18:31.594332 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.593956 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 14:18:31.594332 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.593961 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 14:18:31.594332 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.594127 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 14:18:31.594332 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.594197 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 14:18:31.683338 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.683186 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-v7xr8"]
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730103 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4413c3ef-9f87-47de-bc69-50c496ba4b87-config-out\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730147 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-config-volume\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730178 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730205 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-web-config\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4413c3ef-9f87-47de-bc69-50c496ba4b87-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730270 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730355 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730399 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4413c3ef-9f87-47de-bc69-50c496ba4b87-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730499 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-429qj\" (UniqueName: \"kubernetes.io/projected/4413c3ef-9f87-47de-bc69-50c496ba4b87-kube-api-access-429qj\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730542 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4413c3ef-9f87-47de-bc69-50c496ba4b87-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.732703 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.730621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4413c3ef-9f87-47de-bc69-50c496ba4b87-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.831454 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.831390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4413c3ef-9f87-47de-bc69-50c496ba4b87-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.831454 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.831435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.831734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.831474 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4413c3ef-9f87-47de-bc69-50c496ba4b87-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4413c3ef-9f87-47de-bc69-50c496ba4b87-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4413c3ef-9f87-47de-bc69-50c496ba4b87-config-out\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-config-volume\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-web-config\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4413c3ef-9f87-47de-bc69-50c496ba4b87-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832493 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4413c3ef-9f87-47de-bc69-50c496ba4b87-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.833275 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.832647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-429qj\" (UniqueName: \"kubernetes.io/projected/4413c3ef-9f87-47de-bc69-50c496ba4b87-kube-api-access-429qj\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.834649 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.834016 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4413c3ef-9f87-47de-bc69-50c496ba4b87-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.834649 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.834235 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4413c3ef-9f87-47de-bc69-50c496ba4b87-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.834649 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.834548 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4413c3ef-9f87-47de-bc69-50c496ba4b87-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.838102 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.837968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4413c3ef-9f87-47de-bc69-50c496ba4b87-config-out\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.838246 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.838223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-config-volume\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.838487 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.838435 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.847739 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.847719 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.847864 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.847847 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.847914 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.847885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-web-config\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.847963 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.847927 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.848330 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.848310 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4413c3ef-9f87-47de-bc69-50c496ba4b87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.850336 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.850306 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-429qj\" (UniqueName: \"kubernetes.io/projected/4413c3ef-9f87-47de-bc69-50c496ba4b87-kube-api-access-429qj\") pod \"alertmanager-main-0\" (UID: \"4413c3ef-9f87-47de-bc69-50c496ba4b87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:31.875458 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:31.875434 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc820e8c6_75f5_4e79_9f8c_04e662cce3e8.slice/crio-3cde1722b059feb92039627890062db77ad71ceb371214157137db35cd49bd15 WatchSource:0}: Error finding container 3cde1722b059feb92039627890062db77ad71ceb371214157137db35cd49bd15: Status 404 returned error can't find the container with id 3cde1722b059feb92039627890062db77ad71ceb371214157137db35cd49bd15
Apr 22 14:18:31.899198 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:31.899176 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:18:32.045965 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.045389 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:18:32.151929 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:32.151842 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4413c3ef_9f87_47de_bc69_50c496ba4b87.slice/crio-4a2ecb6e1bf098480da1dd7cd48d4d305a22d309cc0570401d7fcf81d0137eaf WatchSource:0}: Error finding container 4a2ecb6e1bf098480da1dd7cd48d4d305a22d309cc0570401d7fcf81d0137eaf: Status 404 returned error can't find the container with id 4a2ecb6e1bf098480da1dd7cd48d4d305a22d309cc0570401d7fcf81d0137eaf
Apr 22 14:18:32.371929 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.371893 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kngm6" event={"ID":"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194","Type":"ContainerStarted","Data":"452aada6968e2ea0936dda51c8744194207618c66081bdad7df3cd0531e57d85"}
Apr 22 14:18:32.374315 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.374282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4413c3ef-9f87-47de-bc69-50c496ba4b87","Type":"ContainerStarted","Data":"4a2ecb6e1bf098480da1dd7cd48d4d305a22d309cc0570401d7fcf81d0137eaf"}
Apr 22 14:18:32.375836 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.375791 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" event={"ID":"c820e8c6-75f5-4e79-9f8c-04e662cce3e8","Type":"ContainerStarted","Data":"3cde1722b059feb92039627890062db77ad71ceb371214157137db35cd49bd15"}
Apr 22 14:18:32.621178 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.621031 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"]
Apr 22 14:18:32.641941 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.641044 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.646734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.644497 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"]
Apr 22 14:18:32.646734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.645084 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 14:18:32.646734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.645297 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-g22bp\""
Apr 22 14:18:32.646734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.645601 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 14:18:32.646734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.645856 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 14:18:32.646734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.646044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 14:18:32.646734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.646276 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5nna425ls9ajd\""
Apr 22 14:18:32.646734 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.646485 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 14:18:32.742705 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.742515 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.742705 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.742560 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd638ff7-6608-49c3-8b03-48587688258d-metrics-client-ca\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.742705 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.742584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q67q\" (UniqueName: \"kubernetes.io/projected/bd638ff7-6608-49c3-8b03-48587688258d-kube-api-access-7q67q\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.742705 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.742605 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.742705 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.742674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.743085 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.742741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.743085 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.742761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-tls\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.743085 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.742803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-grpc-tls\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.845139 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.844333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.845139 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.844379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd638ff7-6608-49c3-8b03-48587688258d-metrics-client-ca\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.845139 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.844405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q67q\" (UniqueName: \"kubernetes.io/projected/bd638ff7-6608-49c3-8b03-48587688258d-kube-api-access-7q67q\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.845139 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.844436 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.845139 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.844486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.845139 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.844548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.845139 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.844577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-tls\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.845139 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.844628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-grpc-tls\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.847191 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.847163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd638ff7-6608-49c3-8b03-48587688258d-metrics-client-ca\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.849501 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.849452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-tls\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.849607 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.849528 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.851067 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.851043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.852811 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.852760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.853856 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.853815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-grpc-tls\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.855532 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.855486 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bd638ff7-6608-49c3-8b03-48587688258d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.859726 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.859674 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q67q\" (UniqueName: \"kubernetes.io/projected/bd638ff7-6608-49c3-8b03-48587688258d-kube-api-access-7q67q\") pod \"thanos-querier-bd7945d5d-b2m8s\" (UID: \"bd638ff7-6608-49c3-8b03-48587688258d\") " pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"
Apr 22 14:18:32.964742 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:32.964658 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" Apr 22 14:18:33.379974 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:33.379936 2573 generic.go:358] "Generic (PLEG): container finished" podID="cfe6ba1b-0e9f-4a07-ae97-e903f41c1194" containerID="452aada6968e2ea0936dda51c8744194207618c66081bdad7df3cd0531e57d85" exitCode=0 Apr 22 14:18:33.380276 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:33.380086 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kngm6" event={"ID":"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194","Type":"ContainerDied","Data":"452aada6968e2ea0936dda51c8744194207618c66081bdad7df3cd0531e57d85"} Apr 22 14:18:34.363793 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.363669 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-bd7945d5d-b2m8s"] Apr 22 14:18:34.366063 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:34.366028 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd638ff7_6608_49c3_8b03_48587688258d.slice/crio-108818a6f31722791ef40730913a17fbd3212cbc2bb8fa94a257b44fb38cd560 WatchSource:0}: Error finding container 108818a6f31722791ef40730913a17fbd3212cbc2bb8fa94a257b44fb38cd560: Status 404 returned error can't find the container with id 108818a6f31722791ef40730913a17fbd3212cbc2bb8fa94a257b44fb38cd560 Apr 22 14:18:34.384284 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.384235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" event={"ID":"bd638ff7-6608-49c3-8b03-48587688258d","Type":"ContainerStarted","Data":"108818a6f31722791ef40730913a17fbd3212cbc2bb8fa94a257b44fb38cd560"} Apr 22 14:18:34.386165 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.386138 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kngm6" 
event={"ID":"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194","Type":"ContainerStarted","Data":"1b48902480a9c34a0b1592a7b3c240000758129b9e8ffc6fa430b288c86a1f25"} Apr 22 14:18:34.387640 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.387615 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" event={"ID":"c820e8c6-75f5-4e79-9f8c-04e662cce3e8","Type":"ContainerStarted","Data":"5b7804fa889d364550cb24590259cf2d11021b99e0509dd12ab186011b94c457"} Apr 22 14:18:34.877320 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.877283 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6"] Apr 22 14:18:34.896150 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.896101 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6"] Apr 22 14:18:34.896315 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.896254 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:34.899173 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.899150 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 14:18:34.899339 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.899322 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 14:18:34.899429 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.899341 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-229bj\"" Apr 22 14:18:34.899429 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.899353 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 14:18:34.899536 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.899324 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1juh1ebhrndvl\"" Apr 22 14:18:34.899625 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.899609 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 14:18:34.966222 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.966188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/df709a0c-327c-4e03-976f-3eae1b7859fa-secret-metrics-server-client-certs\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:34.966417 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.966313 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/df709a0c-327c-4e03-976f-3eae1b7859fa-metrics-server-audit-profiles\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:34.966417 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.966355 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/df709a0c-327c-4e03-976f-3eae1b7859fa-secret-metrics-server-tls\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:34.966417 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.966393 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df709a0c-327c-4e03-976f-3eae1b7859fa-client-ca-bundle\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:34.966574 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.966497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsz6t\" (UniqueName: \"kubernetes.io/projected/df709a0c-327c-4e03-976f-3eae1b7859fa-kube-api-access-dsz6t\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:34.966626 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.966570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/df709a0c-327c-4e03-976f-3eae1b7859fa-audit-log\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:34.966669 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:34.966634 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df709a0c-327c-4e03-976f-3eae1b7859fa-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.067933 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.067677 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/df709a0c-327c-4e03-976f-3eae1b7859fa-audit-log\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.067933 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.067759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df709a0c-327c-4e03-976f-3eae1b7859fa-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.067933 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.067797 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/df709a0c-327c-4e03-976f-3eae1b7859fa-secret-metrics-server-client-certs\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " 
pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.068240 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.068174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/df709a0c-327c-4e03-976f-3eae1b7859fa-audit-log\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.068584 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.068316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/df709a0c-327c-4e03-976f-3eae1b7859fa-metrics-server-audit-profiles\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.068584 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.068429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/df709a0c-327c-4e03-976f-3eae1b7859fa-secret-metrics-server-tls\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.068584 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.068450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df709a0c-327c-4e03-976f-3eae1b7859fa-client-ca-bundle\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.068796 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.068706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsz6t\" (UniqueName: 
\"kubernetes.io/projected/df709a0c-327c-4e03-976f-3eae1b7859fa-kube-api-access-dsz6t\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.069332 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.069296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/df709a0c-327c-4e03-976f-3eae1b7859fa-metrics-server-audit-profiles\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.069636 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.069605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df709a0c-327c-4e03-976f-3eae1b7859fa-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.071350 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.071327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/df709a0c-327c-4e03-976f-3eae1b7859fa-secret-metrics-server-tls\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.071828 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.071803 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/df709a0c-327c-4e03-976f-3eae1b7859fa-secret-metrics-server-client-certs\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " 
pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.071929 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.071842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df709a0c-327c-4e03-976f-3eae1b7859fa-client-ca-bundle\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.077857 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.077839 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsz6t\" (UniqueName: \"kubernetes.io/projected/df709a0c-327c-4e03-976f-3eae1b7859fa-kube-api-access-dsz6t\") pod \"metrics-server-66fbd5dbbd-nwpr6\" (UID: \"df709a0c-327c-4e03-976f-3eae1b7859fa\") " pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.208429 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.208353 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:35.367256 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.367215 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6"] Apr 22 14:18:35.374362 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:35.374330 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf709a0c_327c_4e03_976f_3eae1b7859fa.slice/crio-567778f2ac456791587bad594d31814f04e2812edbf99977bf949e7e9792532e WatchSource:0}: Error finding container 567778f2ac456791587bad594d31814f04e2812edbf99977bf949e7e9792532e: Status 404 returned error can't find the container with id 567778f2ac456791587bad594d31814f04e2812edbf99977bf949e7e9792532e Apr 22 14:18:35.392208 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.392177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" event={"ID":"df709a0c-327c-4e03-976f-3eae1b7859fa","Type":"ContainerStarted","Data":"567778f2ac456791587bad594d31814f04e2812edbf99977bf949e7e9792532e"} Apr 22 14:18:35.395173 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.395147 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kngm6" event={"ID":"cfe6ba1b-0e9f-4a07-ae97-e903f41c1194","Type":"ContainerStarted","Data":"26a90bfc202a9cf59910bad7a19b7e59089e15dc5fe7995bb08f1564ef7dfa9d"} Apr 22 14:18:35.397468 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.396957 2573 generic.go:358] "Generic (PLEG): container finished" podID="4413c3ef-9f87-47de-bc69-50c496ba4b87" containerID="2fcdd6a8b37e9624145d9bcb52f95a8bd4230fe4ef30944c1af0cbe838fe67f0" exitCode=0 Apr 22 14:18:35.397468 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.397028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"4413c3ef-9f87-47de-bc69-50c496ba4b87","Type":"ContainerDied","Data":"2fcdd6a8b37e9624145d9bcb52f95a8bd4230fe4ef30944c1af0cbe838fe67f0"} Apr 22 14:18:35.400450 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.400420 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" event={"ID":"c820e8c6-75f5-4e79-9f8c-04e662cce3e8","Type":"ContainerStarted","Data":"b7c607feb44edcc20d53e606afd90978dcdccdb5d813dda1e26f1fbfcdc22858"} Apr 22 14:18:35.400547 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.400453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" event={"ID":"c820e8c6-75f5-4e79-9f8c-04e662cce3e8","Type":"ContainerStarted","Data":"50fc1544be04d0bf395a0bd98f6bd43fa8d73cb1e9a4f21301e73233e84c4d9a"} Apr 22 14:18:35.416221 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.416175 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kngm6" podStartSLOduration=4.189978508 podStartE2EDuration="5.416160688s" podCreationTimestamp="2026-04-22 14:18:30 +0000 UTC" firstStartedPulling="2026-04-22 14:18:30.946621968 +0000 UTC m=+195.744384568" lastFinishedPulling="2026-04-22 14:18:32.172804133 +0000 UTC m=+196.970566748" observedRunningTime="2026-04-22 14:18:35.415554655 +0000 UTC m=+200.213317279" watchObservedRunningTime="2026-04-22 14:18:35.416160688 +0000 UTC m=+200.213923310" Apr 22 14:18:35.434374 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:35.434331 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-v7xr8" podStartSLOduration=3.08828204 podStartE2EDuration="5.434316342s" podCreationTimestamp="2026-04-22 14:18:30 +0000 UTC" firstStartedPulling="2026-04-22 14:18:31.877703976 +0000 UTC m=+196.675466590" lastFinishedPulling="2026-04-22 14:18:34.223738275 +0000 UTC m=+199.021500892" 
observedRunningTime="2026-04-22 14:18:35.432465581 +0000 UTC m=+200.230228228" watchObservedRunningTime="2026-04-22 14:18:35.434316342 +0000 UTC m=+200.232078964" Apr 22 14:18:38.412527 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:38.412498 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" event={"ID":"bd638ff7-6608-49c3-8b03-48587688258d","Type":"ContainerStarted","Data":"cdeef38e257942d236c790ee3360476a393ebea474f346d4ab0d97aadfa9215c"} Apr 22 14:18:38.412884 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:38.412536 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" event={"ID":"bd638ff7-6608-49c3-8b03-48587688258d","Type":"ContainerStarted","Data":"8d1976153c9c34456eb9f711d65460b6c705a5940515a467a381cb4a5b073f1b"} Apr 22 14:18:38.413847 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:38.413819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" event={"ID":"df709a0c-327c-4e03-976f-3eae1b7859fa","Type":"ContainerStarted","Data":"7bf378692cc47549df5c11fd3d6008320a6c4664cc0b99bd8c584d77c32ffacc"} Apr 22 14:18:38.433645 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:38.433606 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" podStartSLOduration=1.801823251 podStartE2EDuration="4.433594734s" podCreationTimestamp="2026-04-22 14:18:34 +0000 UTC" firstStartedPulling="2026-04-22 14:18:35.376591823 +0000 UTC m=+200.174354439" lastFinishedPulling="2026-04-22 14:18:38.008363311 +0000 UTC m=+202.806125922" observedRunningTime="2026-04-22 14:18:38.431241864 +0000 UTC m=+203.229004487" watchObservedRunningTime="2026-04-22 14:18:38.433594734 +0000 UTC m=+203.231357359" Apr 22 14:18:39.419359 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:39.419308 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" event={"ID":"bd638ff7-6608-49c3-8b03-48587688258d","Type":"ContainerStarted","Data":"b32fbf9f677ff8a348ec12cc81dd85f226f7702eae26b13e1e7cfb9eacb8316d"} Apr 22 14:18:39.422297 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:39.422273 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4413c3ef-9f87-47de-bc69-50c496ba4b87","Type":"ContainerStarted","Data":"fe0ada6329164396cb13bb6040b2bd55196a26798792c2bd2420dd5dc82a9727"} Apr 22 14:18:39.422432 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:39.422302 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4413c3ef-9f87-47de-bc69-50c496ba4b87","Type":"ContainerStarted","Data":"ac0d6dbd64926d3d04ae33978d5c5a6690c039b04b85fcf6eec43663ccb6f76a"} Apr 22 14:18:39.422432 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:39.422317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4413c3ef-9f87-47de-bc69-50c496ba4b87","Type":"ContainerStarted","Data":"85b9642bea74ffcab1bcf706cf11d8b1e5c9282b729b0c98cca62ad1d8324ba4"} Apr 22 14:18:39.422432 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:39.422334 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4413c3ef-9f87-47de-bc69-50c496ba4b87","Type":"ContainerStarted","Data":"1b157296a405d912f8ef1ee7b990e3e74d432ed320521b98699b05d84e95116a"} Apr 22 14:18:39.422432 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:39.422344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4413c3ef-9f87-47de-bc69-50c496ba4b87","Type":"ContainerStarted","Data":"c64197420df3e0bfe5145c3b1eb4a30dbac33cfcd88c399fe2af1774f88e10c2"} Apr 22 14:18:40.337049 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.337017 2573 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-console/console-644d77ffd8-gjwtj"] Apr 22 14:18:40.358441 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.358417 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-644d77ffd8-gjwtj"] Apr 22 14:18:40.358559 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.358545 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.365534 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.365510 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 14:18:40.415563 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.415537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjbn4\" (UniqueName: \"kubernetes.io/projected/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-kube-api-access-rjbn4\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.415769 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.415575 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-trusted-ca-bundle\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.415769 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.415709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-oauth-config\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.415769 
ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.415758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-config\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.415923 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.415780 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-service-ca\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.415923 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.415802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-oauth-serving-cert\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.415923 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.415839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-serving-cert\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.427208 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.427182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" 
event={"ID":"bd638ff7-6608-49c3-8b03-48587688258d","Type":"ContainerStarted","Data":"a3e2fa66999c66cccefa6759ad3979eb4952441db6a7af74282ae6e9af08595c"} Apr 22 14:18:40.427540 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.427218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" event={"ID":"bd638ff7-6608-49c3-8b03-48587688258d","Type":"ContainerStarted","Data":"00e9af6cd8c5f5bceab99889853e55c0b4e9152fff69bf6fa92181f2bb51e9ec"} Apr 22 14:18:40.427540 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.427230 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" event={"ID":"bd638ff7-6608-49c3-8b03-48587688258d","Type":"ContainerStarted","Data":"293f14b5e00408bd946f9acac9eb5dcef8ed350b8fd1828657b26b62918a6515"} Apr 22 14:18:40.427540 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.427326 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" Apr 22 14:18:40.430078 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.430054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4413c3ef-9f87-47de-bc69-50c496ba4b87","Type":"ContainerStarted","Data":"43ec7056ac7910aca2fe12838e597669fbd0d1f20a429352b318efa8103b8550"} Apr 22 14:18:40.449682 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.449614 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" podStartSLOduration=2.8058958130000002 podStartE2EDuration="8.449603765s" podCreationTimestamp="2026-04-22 14:18:32 +0000 UTC" firstStartedPulling="2026-04-22 14:18:34.368091232 +0000 UTC m=+199.165853835" lastFinishedPulling="2026-04-22 14:18:40.011799184 +0000 UTC m=+204.809561787" observedRunningTime="2026-04-22 14:18:40.447676281 +0000 UTC m=+205.245438926" 
watchObservedRunningTime="2026-04-22 14:18:40.449603765 +0000 UTC m=+205.247366385" Apr 22 14:18:40.471673 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.471634 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.628754812 podStartE2EDuration="9.471623317s" podCreationTimestamp="2026-04-22 14:18:31 +0000 UTC" firstStartedPulling="2026-04-22 14:18:32.169646336 +0000 UTC m=+196.967408945" lastFinishedPulling="2026-04-22 14:18:40.012514847 +0000 UTC m=+204.810277450" observedRunningTime="2026-04-22 14:18:40.470676133 +0000 UTC m=+205.268438754" watchObservedRunningTime="2026-04-22 14:18:40.471623317 +0000 UTC m=+205.269385937" Apr 22 14:18:40.516644 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.516618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-oauth-config\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.516792 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.516673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-config\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.516909 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.516886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-service-ca\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.516980 ip-10-0-136-45 
kubenswrapper[2573]: I0422 14:18:40.516924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-oauth-serving-cert\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.516980 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.516972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-serving-cert\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.518033 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.517107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjbn4\" (UniqueName: \"kubernetes.io/projected/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-kube-api-access-rjbn4\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.518033 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.517143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-trusted-ca-bundle\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.518033 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.517406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-config\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " 
pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.518033 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.517521 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-service-ca\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.518033 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.517591 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-oauth-serving-cert\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.518356 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.518338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-trusted-ca-bundle\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.519141 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.519121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-oauth-config\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.519340 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.519322 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-serving-cert\") pod \"console-644d77ffd8-gjwtj\" (UID: 
\"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.525983 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.525961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjbn4\" (UniqueName: \"kubernetes.io/projected/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-kube-api-access-rjbn4\") pod \"console-644d77ffd8-gjwtj\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") " pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.668537 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.668517 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:40.783374 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:40.783228 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-644d77ffd8-gjwtj"] Apr 22 14:18:40.785860 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:18:40.785829 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cf87100_c2b5_4349_9d2c_6ad0ad4d4254.slice/crio-e6134b6725d836260733cb0fe2f12297c53a187c15892c013930fa10268e8556 WatchSource:0}: Error finding container e6134b6725d836260733cb0fe2f12297c53a187c15892c013930fa10268e8556: Status 404 returned error can't find the container with id e6134b6725d836260733cb0fe2f12297c53a187c15892c013930fa10268e8556 Apr 22 14:18:41.434854 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:41.434823 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-644d77ffd8-gjwtj" event={"ID":"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254","Type":"ContainerStarted","Data":"96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9"} Apr 22 14:18:41.434854 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:41.434855 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-644d77ffd8-gjwtj" 
event={"ID":"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254","Type":"ContainerStarted","Data":"e6134b6725d836260733cb0fe2f12297c53a187c15892c013930fa10268e8556"} Apr 22 14:18:41.455027 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:41.454977 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-644d77ffd8-gjwtj" podStartSLOduration=1.454960306 podStartE2EDuration="1.454960306s" podCreationTimestamp="2026-04-22 14:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:18:41.454277904 +0000 UTC m=+206.252040522" watchObservedRunningTime="2026-04-22 14:18:41.454960306 +0000 UTC m=+206.252722927" Apr 22 14:18:46.441470 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:46.441438 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-bd7945d5d-b2m8s" Apr 22 14:18:50.669592 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:50.669553 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:50.670105 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:50.669643 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:50.674454 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:50.674434 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:51.469276 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:51.469248 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-644d77ffd8-gjwtj" Apr 22 14:18:54.565092 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.565031 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5768447b96-9pmff" 
podUID="57870da4-8b9b-45d3-a028-0ec65ce502ca" containerName="console" containerID="cri-o://38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71" gracePeriod=15 Apr 22 14:18:54.795368 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.795342 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5768447b96-9pmff_57870da4-8b9b-45d3-a028-0ec65ce502ca/console/0.log" Apr 22 14:18:54.795514 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.795426 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:54.835493 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.835406 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-config\") pod \"57870da4-8b9b-45d3-a028-0ec65ce502ca\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " Apr 22 14:18:54.835493 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.835446 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-serving-cert\") pod \"57870da4-8b9b-45d3-a028-0ec65ce502ca\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " Apr 22 14:18:54.835493 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.835471 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-oauth-config\") pod \"57870da4-8b9b-45d3-a028-0ec65ce502ca\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " Apr 22 14:18:54.835765 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.835528 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvmfj\" (UniqueName: 
\"kubernetes.io/projected/57870da4-8b9b-45d3-a028-0ec65ce502ca-kube-api-access-fvmfj\") pod \"57870da4-8b9b-45d3-a028-0ec65ce502ca\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " Apr 22 14:18:54.835765 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.835562 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-service-ca\") pod \"57870da4-8b9b-45d3-a028-0ec65ce502ca\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " Apr 22 14:18:54.835765 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.835661 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-oauth-serving-cert\") pod \"57870da4-8b9b-45d3-a028-0ec65ce502ca\" (UID: \"57870da4-8b9b-45d3-a028-0ec65ce502ca\") " Apr 22 14:18:54.835903 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.835868 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-config" (OuterVolumeSpecName: "console-config") pod "57870da4-8b9b-45d3-a028-0ec65ce502ca" (UID: "57870da4-8b9b-45d3-a028-0ec65ce502ca"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:54.835977 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.835953 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-service-ca" (OuterVolumeSpecName: "service-ca") pod "57870da4-8b9b-45d3-a028-0ec65ce502ca" (UID: "57870da4-8b9b-45d3-a028-0ec65ce502ca"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:54.836029 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.836016 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-config\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:54.836089 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.836028 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-service-ca\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:54.836272 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.836245 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "57870da4-8b9b-45d3-a028-0ec65ce502ca" (UID: "57870da4-8b9b-45d3-a028-0ec65ce502ca"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:54.837927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.837899 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "57870da4-8b9b-45d3-a028-0ec65ce502ca" (UID: "57870da4-8b9b-45d3-a028-0ec65ce502ca"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:54.837927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.837915 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "57870da4-8b9b-45d3-a028-0ec65ce502ca" (UID: "57870da4-8b9b-45d3-a028-0ec65ce502ca"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:54.838056 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.837954 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57870da4-8b9b-45d3-a028-0ec65ce502ca-kube-api-access-fvmfj" (OuterVolumeSpecName: "kube-api-access-fvmfj") pod "57870da4-8b9b-45d3-a028-0ec65ce502ca" (UID: "57870da4-8b9b-45d3-a028-0ec65ce502ca"). InnerVolumeSpecName "kube-api-access-fvmfj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:54.855302 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.855263 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" podUID="5e58ca55-5638-4845-9cb8-959ad4a0d61f" containerName="registry" containerID="cri-o://c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b" gracePeriod=30 Apr 22 14:18:54.936698 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.936660 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-serving-cert\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:54.936913 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.936711 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57870da4-8b9b-45d3-a028-0ec65ce502ca-console-oauth-config\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:54.936913 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:54.936729 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fvmfj\" (UniqueName: \"kubernetes.io/projected/57870da4-8b9b-45d3-a028-0ec65ce502ca-kube-api-access-fvmfj\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:54.936913 ip-10-0-136-45 kubenswrapper[2573]: I0422 
14:18:54.936744 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57870da4-8b9b-45d3-a028-0ec65ce502ca-oauth-serving-cert\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:55.088368 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.088315 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" Apr 22 14:18:55.138757 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.138727 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-trusted-ca\") pod \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " Apr 22 14:18:55.138757 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.138774 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") pod \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " Apr 22 14:18:55.139000 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.138800 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-image-registry-private-configuration\") pod \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " Apr 22 14:18:55.139000 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.138820 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-installation-pull-secrets\") pod \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\" (UID: 
\"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " Apr 22 14:18:55.139000 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.138845 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e58ca55-5638-4845-9cb8-959ad4a0d61f-ca-trust-extracted\") pod \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " Apr 22 14:18:55.139000 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.138881 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-certificates\") pod \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " Apr 22 14:18:55.139000 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.138899 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjmqs\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-kube-api-access-fjmqs\") pod \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " Apr 22 14:18:55.139000 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.138918 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-bound-sa-token\") pod \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\" (UID: \"5e58ca55-5638-4845-9cb8-959ad4a0d61f\") " Apr 22 14:18:55.139340 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.139309 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5e58ca55-5638-4845-9cb8-959ad4a0d61f" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:55.139422 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.139396 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5e58ca55-5638-4845-9cb8-959ad4a0d61f" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:55.141522 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.141489 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5e58ca55-5638-4845-9cb8-959ad4a0d61f" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:55.141522 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.141501 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5e58ca55-5638-4845-9cb8-959ad4a0d61f" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:55.141697 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.141577 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-kube-api-access-fjmqs" (OuterVolumeSpecName: "kube-api-access-fjmqs") pod "5e58ca55-5638-4845-9cb8-959ad4a0d61f" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f"). InnerVolumeSpecName "kube-api-access-fjmqs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:55.141697 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.141593 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5e58ca55-5638-4845-9cb8-959ad4a0d61f" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:55.141697 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.141641 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5e58ca55-5638-4845-9cb8-959ad4a0d61f" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:55.147643 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.147617 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e58ca55-5638-4845-9cb8-959ad4a0d61f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5e58ca55-5638-4845-9cb8-959ad4a0d61f" (UID: "5e58ca55-5638-4845-9cb8-959ad4a0d61f"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:18:55.208937 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.208884 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:55.208937 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.208944 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6" Apr 22 14:18:55.240182 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.240155 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-trusted-ca\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:55.240182 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.240181 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-tls\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:55.240305 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.240192 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-image-registry-private-configuration\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:55.240305 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.240202 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e58ca55-5638-4845-9cb8-959ad4a0d61f-installation-pull-secrets\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:55.240305 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.240213 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/5e58ca55-5638-4845-9cb8-959ad4a0d61f-ca-trust-extracted\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:55.240305 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.240222 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e58ca55-5638-4845-9cb8-959ad4a0d61f-registry-certificates\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:55.240305 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.240230 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjmqs\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-kube-api-access-fjmqs\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:55.240305 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.240239 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e58ca55-5638-4845-9cb8-959ad4a0d61f-bound-sa-token\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:18:55.478184 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.478093 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5768447b96-9pmff_57870da4-8b9b-45d3-a028-0ec65ce502ca/console/0.log" Apr 22 14:18:55.478184 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.478136 2573 generic.go:358] "Generic (PLEG): container finished" podID="57870da4-8b9b-45d3-a028-0ec65ce502ca" containerID="38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71" exitCode=2 Apr 22 14:18:55.478376 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.478200 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5768447b96-9pmff" Apr 22 14:18:55.478376 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.478220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5768447b96-9pmff" event={"ID":"57870da4-8b9b-45d3-a028-0ec65ce502ca","Type":"ContainerDied","Data":"38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71"} Apr 22 14:18:55.478376 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.478259 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5768447b96-9pmff" event={"ID":"57870da4-8b9b-45d3-a028-0ec65ce502ca","Type":"ContainerDied","Data":"01dbf7f11a018f4711fdc5586241a010c97c332cd7d0eb1e3d55adee1c0fabbc"} Apr 22 14:18:55.478376 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.478276 2573 scope.go:117] "RemoveContainer" containerID="38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71" Apr 22 14:18:55.479521 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.479438 2573 generic.go:358] "Generic (PLEG): container finished" podID="5e58ca55-5638-4845-9cb8-959ad4a0d61f" containerID="c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b" exitCode=0 Apr 22 14:18:55.479521 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.479492 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" event={"ID":"5e58ca55-5638-4845-9cb8-959ad4a0d61f","Type":"ContainerDied","Data":"c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b"} Apr 22 14:18:55.479521 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.479498 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"
Apr 22 14:18:55.479521 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.479513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v" event={"ID":"5e58ca55-5638-4845-9cb8-959ad4a0d61f","Type":"ContainerDied","Data":"096c6a003e92ea2264494d58cf3690cf06733d8b3cc304b4438206d04339cf75"}
Apr 22 14:18:55.486914 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.486888 2573 scope.go:117] "RemoveContainer" containerID="38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71"
Apr 22 14:18:55.487197 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:55.487175 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71\": container with ID starting with 38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71 not found: ID does not exist" containerID="38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71"
Apr 22 14:18:55.487238 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.487206 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71"} err="failed to get container status \"38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71\": rpc error: code = NotFound desc = could not find container \"38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71\": container with ID starting with 38ce46a00979575cec89b82d360deb84b9e72f2bc0cbe2379a066172bc2adb71 not found: ID does not exist"
Apr 22 14:18:55.487289 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.487240 2573 scope.go:117] "RemoveContainer" containerID="c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b"
Apr 22 14:18:55.494628 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.494611 2573 scope.go:117] "RemoveContainer" containerID="c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b"
Apr 22 14:18:55.494952 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:18:55.494934 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b\": container with ID starting with c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b not found: ID does not exist" containerID="c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b"
Apr 22 14:18:55.495026 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.494959 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b"} err="failed to get container status \"c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b\": rpc error: code = NotFound desc = could not find container \"c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b\": container with ID starting with c4aa91fe1660c70b04c58a7e743adf7ca7a8f2cc7b2153a0f75e96755f44608b not found: ID does not exist"
Apr 22 14:18:55.501727 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.501702 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5768447b96-9pmff"]
Apr 22 14:18:55.503961 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.503941 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5768447b96-9pmff"]
Apr 22 14:18:55.513895 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.513854 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"]
Apr 22 14:18:55.515801 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.515775 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5dbc4fdcb4-5wb6v"]
Apr 22 14:18:55.778091 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.778053 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57870da4-8b9b-45d3-a028-0ec65ce502ca" path="/var/lib/kubelet/pods/57870da4-8b9b-45d3-a028-0ec65ce502ca/volumes"
Apr 22 14:18:55.778571 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:18:55.778556 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e58ca55-5638-4845-9cb8-959ad4a0d61f" path="/var/lib/kubelet/pods/5e58ca55-5638-4845-9cb8-959ad4a0d61f/volumes"
Apr 22 14:19:02.502443 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:02.502415 2573 generic.go:358] "Generic (PLEG): container finished" podID="9de5847f-e323-4cd6-9aef-fde65fdaa5e2" containerID="886dba13d9250732e158f3646ed45eefa11f09d9982f2192151f8eabca4f41ad" exitCode=0
Apr 22 14:19:02.502841 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:02.502493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2zk62" event={"ID":"9de5847f-e323-4cd6-9aef-fde65fdaa5e2","Type":"ContainerDied","Data":"886dba13d9250732e158f3646ed45eefa11f09d9982f2192151f8eabca4f41ad"}
Apr 22 14:19:02.502841 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:02.502814 2573 scope.go:117] "RemoveContainer" containerID="886dba13d9250732e158f3646ed45eefa11f09d9982f2192151f8eabca4f41ad"
Apr 22 14:19:03.506672 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:03.506639 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2zk62" event={"ID":"9de5847f-e323-4cd6-9aef-fde65fdaa5e2","Type":"ContainerStarted","Data":"a947321ee1697993294f76a5d4a06d16d27026e47b0f8c75ae5a1a5509fa008b"}
Apr 22 14:19:12.534929 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:12.534892 2573 generic.go:358] "Generic (PLEG): container finished" podID="dfb9086c-f831-4140-bf45-0520130af0ae" containerID="b3062bb7ee49f09c98ffbc003a9976e75079d923d0ac3153db9711e178d25245" exitCode=0
Apr 22 14:19:12.535365 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:12.534967 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" event={"ID":"dfb9086c-f831-4140-bf45-0520130af0ae","Type":"ContainerDied","Data":"b3062bb7ee49f09c98ffbc003a9976e75079d923d0ac3153db9711e178d25245"}
Apr 22 14:19:12.535365 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:12.535280 2573 scope.go:117] "RemoveContainer" containerID="b3062bb7ee49f09c98ffbc003a9976e75079d923d0ac3153db9711e178d25245"
Apr 22 14:19:13.540532 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:13.540499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9pz79" event={"ID":"dfb9086c-f831-4140-bf45-0520130af0ae","Type":"ContainerStarted","Data":"08aa1781b474827bf657f856c648b3885bf96a32b570a723ac3d12bb7ec7bd63"}
Apr 22 14:19:15.215299 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:15.215262 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6"
Apr 22 14:19:15.219310 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:15.219288 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-66fbd5dbbd-nwpr6"
Apr 22 14:19:27.510747 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:27.510705 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:19:27.513215 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:27.513186 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d-metrics-certs\") pod \"network-metrics-daemon-qh8tk\" (UID: \"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d\") " pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:19:27.678657 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:27.678632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mfkfw\""
Apr 22 14:19:27.686610 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:27.686589 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qh8tk"
Apr 22 14:19:27.805865 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:27.805295 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qh8tk"]
Apr 22 14:19:27.810446 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:19:27.810412 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b80d7aa_1899_4bc8_94fb_e670ad1c3b3d.slice/crio-3562ed7ae4b5846f7578af71ce7eb3d85e81c773ec4260e6ee5dbf9a1f17b3fb WatchSource:0}: Error finding container 3562ed7ae4b5846f7578af71ce7eb3d85e81c773ec4260e6ee5dbf9a1f17b3fb: Status 404 returned error can't find the container with id 3562ed7ae4b5846f7578af71ce7eb3d85e81c773ec4260e6ee5dbf9a1f17b3fb
Apr 22 14:19:28.586872 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:28.586826 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qh8tk" event={"ID":"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d","Type":"ContainerStarted","Data":"3562ed7ae4b5846f7578af71ce7eb3d85e81c773ec4260e6ee5dbf9a1f17b3fb"}
Apr 22 14:19:29.591180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:29.591148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qh8tk" event={"ID":"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d","Type":"ContainerStarted","Data":"057ba92a5dc869d75adb10f47425ab5d5d129b029acd408aea91ef0fcb165639"}
Apr 22 14:19:29.591180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:29.591186 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qh8tk" event={"ID":"6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d","Type":"ContainerStarted","Data":"5f07c3d718722828012baa1400b7a51f649e0b10452e93cf14d58c2df97375e5"}
Apr 22 14:19:29.607402 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:29.607359 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qh8tk" podStartSLOduration=252.625392288 podStartE2EDuration="4m13.60734555s" podCreationTimestamp="2026-04-22 14:15:16 +0000 UTC" firstStartedPulling="2026-04-22 14:19:27.812296906 +0000 UTC m=+252.610059506" lastFinishedPulling="2026-04-22 14:19:28.794250166 +0000 UTC m=+253.592012768" observedRunningTime="2026-04-22 14:19:29.605949479 +0000 UTC m=+254.403712093" watchObservedRunningTime="2026-04-22 14:19:29.60734555 +0000 UTC m=+254.405108169"
Apr 22 14:19:50.453154 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.453081 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c7d65d555-qgblc"]
Apr 22 14:19:50.453578 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.453456 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e58ca55-5638-4845-9cb8-959ad4a0d61f" containerName="registry"
Apr 22 14:19:50.453578 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.453470 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e58ca55-5638-4845-9cb8-959ad4a0d61f" containerName="registry"
Apr 22 14:19:50.453578 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.453480 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57870da4-8b9b-45d3-a028-0ec65ce502ca" containerName="console"
Apr 22 14:19:50.453578 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.453485 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="57870da4-8b9b-45d3-a028-0ec65ce502ca" containerName="console"
Apr 22 14:19:50.453578 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.453542 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e58ca55-5638-4845-9cb8-959ad4a0d61f" containerName="registry"
Apr 22 14:19:50.453578 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.453550 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="57870da4-8b9b-45d3-a028-0ec65ce502ca" containerName="console"
Apr 22 14:19:50.458062 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.458044 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.467715 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.467676 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c7d65d555-qgblc"]
Apr 22 14:19:50.580477 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.580454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d99bg\" (UniqueName: \"kubernetes.io/projected/a69bc128-590b-4933-aa63-ffa8ce995526-kube-api-access-d99bg\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.580585 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.580496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-console-config\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.580585 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.580513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-service-ca\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.580585 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.580565 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-serving-cert\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.580762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.580611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-oauth-config\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.580762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.580631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-trusted-ca-bundle\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.580762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.580647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-oauth-serving-cert\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.681683 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.681659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-console-config\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.681813 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.681717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-service-ca\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.681813 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.681747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-serving-cert\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.681813 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.681786 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-oauth-config\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.681964 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.681814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-trusted-ca-bundle\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.681964 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.681837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-oauth-serving-cert\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.681964 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.681907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d99bg\" (UniqueName: \"kubernetes.io/projected/a69bc128-590b-4933-aa63-ffa8ce995526-kube-api-access-d99bg\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.684681 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.682653 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-service-ca\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.684681 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.682769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-oauth-serving-cert\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.684681 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.682774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-console-config\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.684681 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.683446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-trusted-ca-bundle\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.688304 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.685331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-oauth-config\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.688879 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.688862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-serving-cert\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.690378 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.690358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d99bg\" (UniqueName: \"kubernetes.io/projected/a69bc128-590b-4933-aa63-ffa8ce995526-kube-api-access-d99bg\") pod \"console-c7d65d555-qgblc\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.768341 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.768318 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:19:50.889082 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:50.889054 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c7d65d555-qgblc"]
Apr 22 14:19:50.892090 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:19:50.892061 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda69bc128_590b_4933_aa63_ffa8ce995526.slice/crio-a9f759706d05cb3fa273bac41dc8626be262334e080edfd743e6a7b3bbb0b11d WatchSource:0}: Error finding container a9f759706d05cb3fa273bac41dc8626be262334e080edfd743e6a7b3bbb0b11d: Status 404 returned error can't find the container with id a9f759706d05cb3fa273bac41dc8626be262334e080edfd743e6a7b3bbb0b11d
Apr 22 14:19:51.664545 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:51.664460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7d65d555-qgblc" event={"ID":"a69bc128-590b-4933-aa63-ffa8ce995526","Type":"ContainerStarted","Data":"959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba"}
Apr 22 14:19:51.664545 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:51.664503 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7d65d555-qgblc" event={"ID":"a69bc128-590b-4933-aa63-ffa8ce995526","Type":"ContainerStarted","Data":"a9f759706d05cb3fa273bac41dc8626be262334e080edfd743e6a7b3bbb0b11d"}
Apr 22 14:19:51.681937 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:19:51.681892 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c7d65d555-qgblc" podStartSLOduration=1.681879334 podStartE2EDuration="1.681879334s" podCreationTimestamp="2026-04-22 14:19:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:19:51.679896447 +0000 UTC m=+276.477659070" watchObservedRunningTime="2026-04-22 14:19:51.681879334 +0000 UTC m=+276.479641954"
Apr 22 14:20:00.769213 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:00.769180 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:20:00.769213 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:00.769218 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:20:00.773635 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:00.773613 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:20:01.697323 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:01.697295 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c7d65d555-qgblc"
Apr 22 14:20:01.746442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:01.746413 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-644d77ffd8-gjwtj"]
Apr 22 14:20:15.659411 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:15.659379 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 14:20:15.661320 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:15.661298 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 14:20:26.764743 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:26.764651 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-644d77ffd8-gjwtj" podUID="3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" containerName="console" containerID="cri-o://96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9" gracePeriod=15
Apr 22 14:20:27.000486 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.000466 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-644d77ffd8-gjwtj_3cf87100-c2b5-4349-9d2c-6ad0ad4d4254/console/0.log"
Apr 22 14:20:27.000580 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.000523 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-644d77ffd8-gjwtj"
Apr 22 14:20:27.148956 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.148929 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-trusted-ca-bundle\") pod \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") "
Apr 22 14:20:27.149095 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.148994 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjbn4\" (UniqueName: \"kubernetes.io/projected/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-kube-api-access-rjbn4\") pod \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") "
Apr 22 14:20:27.149095 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.149013 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-oauth-config\") pod \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") "
Apr 22 14:20:27.149095 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.149028 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-serving-cert\") pod \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") "
Apr 22 14:20:27.149095 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.149048 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-service-ca\") pod \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") "
Apr 22 14:20:27.149095 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.149070 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-config\") pod \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") "
Apr 22 14:20:27.149095 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.149084 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-oauth-serving-cert\") pod \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\" (UID: \"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254\") "
Apr 22 14:20:27.149468 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.149429 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" (UID: "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:20:27.149554 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.149528 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-service-ca" (OuterVolumeSpecName: "service-ca") pod "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" (UID: "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:20:27.149554 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.149542 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" (UID: "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:20:27.149641 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.149545 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-config" (OuterVolumeSpecName: "console-config") pod "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" (UID: "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:20:27.151174 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.151146 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" (UID: "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:20:27.151501 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.151471 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" (UID: "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:20:27.151501 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.151484 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-kube-api-access-rjbn4" (OuterVolumeSpecName: "kube-api-access-rjbn4") pod "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" (UID: "3cf87100-c2b5-4349-9d2c-6ad0ad4d4254"). InnerVolumeSpecName "kube-api-access-rjbn4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:20:27.250402 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.250385 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-config\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\""
Apr 22 14:20:27.250487 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.250403 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-oauth-serving-cert\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\""
Apr 22 14:20:27.250487 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.250416 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-trusted-ca-bundle\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\""
Apr 22 14:20:27.250487 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.250425 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rjbn4\" (UniqueName: \"kubernetes.io/projected/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-kube-api-access-rjbn4\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\""
Apr 22 14:20:27.250487 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.250434 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-oauth-config\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\""
Apr 22 14:20:27.250487 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.250442 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-console-serving-cert\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\""
Apr 22 14:20:27.250487 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.250450 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254-service-ca\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\""
Apr 22 14:20:27.776128 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.776098 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-644d77ffd8-gjwtj_3cf87100-c2b5-4349-9d2c-6ad0ad4d4254/console/0.log"
Apr 22 14:20:27.776473 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.776145 2573 generic.go:358] "Generic (PLEG): container finished" podID="3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" containerID="96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9" exitCode=2
Apr 22 14:20:27.776473 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.776293 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-644d77ffd8-gjwtj"
Apr 22 14:20:27.777482 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.777462 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-644d77ffd8-gjwtj" event={"ID":"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254","Type":"ContainerDied","Data":"96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9"}
Apr 22 14:20:27.777548 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.777491 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-644d77ffd8-gjwtj" event={"ID":"3cf87100-c2b5-4349-9d2c-6ad0ad4d4254","Type":"ContainerDied","Data":"e6134b6725d836260733cb0fe2f12297c53a187c15892c013930fa10268e8556"}
Apr 22 14:20:27.777548 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.777508 2573 scope.go:117] "RemoveContainer" containerID="96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9"
Apr 22 14:20:27.786523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.786502 2573 scope.go:117] "RemoveContainer" containerID="96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9"
Apr 22 14:20:27.786883 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:20:27.786859 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9\": container with ID starting with 96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9 not found: ID does not exist" containerID="96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9"
Apr 22 14:20:27.786988 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.786889 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9"} err="failed to get container status \"96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9\": rpc error: code = NotFound desc = could not find container \"96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9\": container with ID starting with 96e67e3fd4b7c12fde2745257f11dedbd7ff9f1c1d66700bf26e81c2a3935ab9 not found: ID does not exist"
Apr 22 14:20:27.799566 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.799547 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-644d77ffd8-gjwtj"]
Apr 22 14:20:27.803628 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:27.803605 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-644d77ffd8-gjwtj"]
Apr 22 14:20:29.778067 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:20:29.778031 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" path="/var/lib/kubelet/pods/3cf87100-c2b5-4349-9d2c-6ad0ad4d4254/volumes"
Apr 22 14:21:29.964092 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:29.964061 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz"]
Apr 22 14:21:29.964520 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:29.964393 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" containerName="console"
Apr 22 14:21:29.964520 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:29.964404 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" containerName="console"
Apr 22 14:21:29.964520 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:29.964449 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3cf87100-c2b5-4349-9d2c-6ad0ad4d4254" containerName="console"
Apr 22 14:21:29.968624 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:29.968604 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz"
Apr 22 14:21:29.971124 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:29.971092 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-smndj\""
Apr 22 14:21:29.971204 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:29.971125 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 14:21:29.971717 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:29.971702 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 14:21:29.976624 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:29.976606 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz"]
Apr 22 14:21:30.121262 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.121225 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgmsn\" (UniqueName: \"kubernetes.io/projected/75f6aa44-e28c-4e8b-bf73-479a34f01f44-kube-api-access-pgmsn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz"
Apr 22 14:21:30.121262 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.121268 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz"
Apr 22 14:21:30.121457 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.121296 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:30.222573 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.222498 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:30.222573 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.222533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:30.222788 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.222603 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgmsn\" (UniqueName: \"kubernetes.io/projected/75f6aa44-e28c-4e8b-bf73-479a34f01f44-kube-api-access-pgmsn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:30.222902 ip-10-0-136-45 kubenswrapper[2573]: 
I0422 14:21:30.222884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:30.222975 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.222953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:30.235003 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.234977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgmsn\" (UniqueName: \"kubernetes.io/projected/75f6aa44-e28c-4e8b-bf73-479a34f01f44-kube-api-access-pgmsn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:30.278964 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.278924 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:30.404379 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.404356 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz"] Apr 22 14:21:30.410321 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.410300 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:21:30.955871 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:30.955829 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" event={"ID":"75f6aa44-e28c-4e8b-bf73-479a34f01f44","Type":"ContainerStarted","Data":"1d23fb5d6e4785e4705cb2362a04ba442e723f91ff16be4977c25111498da4a7"} Apr 22 14:21:35.973183 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:35.973146 2573 generic.go:358] "Generic (PLEG): container finished" podID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerID="6a9400365cb81af65fa2b66b72a4f6edfb4a8742212082de99676b92bf4ba55e" exitCode=0 Apr 22 14:21:35.973554 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:35.973212 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" event={"ID":"75f6aa44-e28c-4e8b-bf73-479a34f01f44","Type":"ContainerDied","Data":"6a9400365cb81af65fa2b66b72a4f6edfb4a8742212082de99676b92bf4ba55e"} Apr 22 14:21:37.981716 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:37.981613 2573 generic.go:358] "Generic (PLEG): container finished" podID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerID="d75fb4bdf2a253e5cb77cca9b009b3fae0b211051de5be7ff7d26c77059db49e" exitCode=0 Apr 22 14:21:37.981716 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:37.981656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" event={"ID":"75f6aa44-e28c-4e8b-bf73-479a34f01f44","Type":"ContainerDied","Data":"d75fb4bdf2a253e5cb77cca9b009b3fae0b211051de5be7ff7d26c77059db49e"} Apr 22 14:21:44.001432 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:44.001355 2573 generic.go:358] "Generic (PLEG): container finished" podID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerID="fed6489ceadfb0a3feb0e865e19aa6c3ece99c4f99f389394d32597e43e99155" exitCode=0 Apr 22 14:21:44.001432 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:44.001402 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" event={"ID":"75f6aa44-e28c-4e8b-bf73-479a34f01f44","Type":"ContainerDied","Data":"fed6489ceadfb0a3feb0e865e19aa6c3ece99c4f99f389394d32597e43e99155"} Apr 22 14:21:45.117020 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.116997 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:45.159277 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.159245 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-util\") pod \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " Apr 22 14:21:45.159431 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.159294 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-bundle\") pod \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " Apr 22 14:21:45.159431 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.159349 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgmsn\" (UniqueName: \"kubernetes.io/projected/75f6aa44-e28c-4e8b-bf73-479a34f01f44-kube-api-access-pgmsn\") pod \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\" (UID: \"75f6aa44-e28c-4e8b-bf73-479a34f01f44\") " Apr 22 14:21:45.159872 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.159840 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-bundle" (OuterVolumeSpecName: "bundle") pod "75f6aa44-e28c-4e8b-bf73-479a34f01f44" (UID: "75f6aa44-e28c-4e8b-bf73-479a34f01f44"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:21:45.161486 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.161462 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f6aa44-e28c-4e8b-bf73-479a34f01f44-kube-api-access-pgmsn" (OuterVolumeSpecName: "kube-api-access-pgmsn") pod "75f6aa44-e28c-4e8b-bf73-479a34f01f44" (UID: "75f6aa44-e28c-4e8b-bf73-479a34f01f44"). InnerVolumeSpecName "kube-api-access-pgmsn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:21:45.163148 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.163126 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-util" (OuterVolumeSpecName: "util") pod "75f6aa44-e28c-4e8b-bf73-479a34f01f44" (UID: "75f6aa44-e28c-4e8b-bf73-479a34f01f44"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:21:45.260032 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.259940 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-util\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:21:45.260032 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.259980 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75f6aa44-e28c-4e8b-bf73-479a34f01f44-bundle\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:21:45.260032 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:45.259990 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgmsn\" (UniqueName: \"kubernetes.io/projected/75f6aa44-e28c-4e8b-bf73-479a34f01f44-kube-api-access-pgmsn\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:21:46.008822 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:46.008785 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" event={"ID":"75f6aa44-e28c-4e8b-bf73-479a34f01f44","Type":"ContainerDied","Data":"1d23fb5d6e4785e4705cb2362a04ba442e723f91ff16be4977c25111498da4a7"} Apr 22 14:21:46.008822 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:46.008818 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d23fb5d6e4785e4705cb2362a04ba442e723f91ff16be4977c25111498da4a7" Apr 22 14:21:46.008822 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:46.008828 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgjdz" Apr 22 14:21:52.790254 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.790215 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2"] Apr 22 14:21:52.790762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.790570 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerName="util" Apr 22 14:21:52.790762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.790582 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerName="util" Apr 22 14:21:52.790762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.790592 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerName="extract" Apr 22 14:21:52.790762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.790597 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerName="extract" Apr 22 14:21:52.790762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.790604 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerName="pull" Apr 22 14:21:52.790762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.790609 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerName="pull" Apr 22 14:21:52.790762 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.790666 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="75f6aa44-e28c-4e8b-bf73-479a34f01f44" containerName="extract" Apr 22 14:21:52.797378 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.797360 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:21:52.800575 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.800555 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 14:21:52.801216 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.801186 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 14:21:52.801313 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.801283 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 14:21:52.801313 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.801294 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-fpd94\"" Apr 22 14:21:52.809770 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.809751 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqwlm\" (UniqueName: \"kubernetes.io/projected/d9012bbd-2ec3-4867-b2ed-d4a5744124a1-kube-api-access-fqwlm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2\" (UID: \"d9012bbd-2ec3-4867-b2ed-d4a5744124a1\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:21:52.809868 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.809819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d9012bbd-2ec3-4867-b2ed-d4a5744124a1-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2\" (UID: \"d9012bbd-2ec3-4867-b2ed-d4a5744124a1\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:21:52.812093 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.812073 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2"] Apr 22 14:21:52.910770 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.910739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d9012bbd-2ec3-4867-b2ed-d4a5744124a1-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2\" (UID: \"d9012bbd-2ec3-4867-b2ed-d4a5744124a1\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:21:52.910927 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.910783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqwlm\" (UniqueName: \"kubernetes.io/projected/d9012bbd-2ec3-4867-b2ed-d4a5744124a1-kube-api-access-fqwlm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2\" (UID: \"d9012bbd-2ec3-4867-b2ed-d4a5744124a1\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:21:52.913068 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.913039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d9012bbd-2ec3-4867-b2ed-d4a5744124a1-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2\" (UID: 
\"d9012bbd-2ec3-4867-b2ed-d4a5744124a1\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:21:52.933383 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:52.933360 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqwlm\" (UniqueName: \"kubernetes.io/projected/d9012bbd-2ec3-4867-b2ed-d4a5744124a1-kube-api-access-fqwlm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2\" (UID: \"d9012bbd-2ec3-4867-b2ed-d4a5744124a1\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:21:53.107356 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:53.107277 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:21:53.238147 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:53.238100 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2"] Apr 22 14:21:53.241344 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:21:53.241314 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9012bbd_2ec3_4867_b2ed_d4a5744124a1.slice/crio-b59feb8edc1295f5f9f874d148efd73e94971c8b9835dd2c6ec8e21eb80a794f WatchSource:0}: Error finding container b59feb8edc1295f5f9f874d148efd73e94971c8b9835dd2c6ec8e21eb80a794f: Status 404 returned error can't find the container with id b59feb8edc1295f5f9f874d148efd73e94971c8b9835dd2c6ec8e21eb80a794f Apr 22 14:21:54.037015 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:54.036968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" event={"ID":"d9012bbd-2ec3-4867-b2ed-d4a5744124a1","Type":"ContainerStarted","Data":"b59feb8edc1295f5f9f874d148efd73e94971c8b9835dd2c6ec8e21eb80a794f"} Apr 22 14:21:56.878197 ip-10-0-136-45 kubenswrapper[2573]: I0422 
14:21:56.878166 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2xh4t"] Apr 22 14:21:56.881633 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:56.881617 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:56.884088 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:56.884071 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 14:21:56.884296 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:56.884269 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 14:21:56.884385 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:56.884292 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-gtznj\"" Apr 22 14:21:56.892465 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:56.892440 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2xh4t"] Apr 22 14:21:56.939839 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:56.939813 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v56dv\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-kube-api-access-v56dv\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:56.940003 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:56.939913 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/26426a9c-b1b0-4f89-a02f-1e912bf03ece-cabundle0\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " 
pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:56.940003 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:56.939956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:57.040262 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.040233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/26426a9c-b1b0-4f89-a02f-1e912bf03ece-cabundle0\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:57.040437 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.040281 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:57.040437 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.040315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v56dv\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-kube-api-access-v56dv\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:57.040437 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:57.040404 2573 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 22 14:21:57.040437 ip-10-0-136-45 kubenswrapper[2573]: E0422 
14:21:57.040426 2573 secret.go:281] references non-existent secret key: ca.crt Apr 22 14:21:57.040437 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:57.040436 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 14:21:57.040706 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:57.040451 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2xh4t: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 14:21:57.040706 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:57.040509 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates podName:26426a9c-b1b0-4f89-a02f-1e912bf03ece nodeName:}" failed. No retries permitted until 2026-04-22 14:21:57.540491204 +0000 UTC m=+402.338253810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates") pod "keda-operator-ffbb595cb-2xh4t" (UID: "26426a9c-b1b0-4f89-a02f-1e912bf03ece") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 14:21:57.040943 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.040926 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/26426a9c-b1b0-4f89-a02f-1e912bf03ece-cabundle0\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:57.049618 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.049582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" 
event={"ID":"d9012bbd-2ec3-4867-b2ed-d4a5744124a1","Type":"ContainerStarted","Data":"e2bc90b02ffa9b18b1ffcbdf9adac38e373735f9947db096d99729b636405614"} Apr 22 14:21:57.049806 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.049783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v56dv\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-kube-api-access-v56dv\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:57.049962 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.049805 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:21:57.074918 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.074878 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" podStartSLOduration=1.968340593 podStartE2EDuration="5.074866303s" podCreationTimestamp="2026-04-22 14:21:52 +0000 UTC" firstStartedPulling="2026-04-22 14:21:53.242986684 +0000 UTC m=+398.040749283" lastFinishedPulling="2026-04-22 14:21:56.349512389 +0000 UTC m=+401.147274993" observedRunningTime="2026-04-22 14:21:57.072505946 +0000 UTC m=+401.870268569" watchObservedRunningTime="2026-04-22 14:21:57.074866303 +0000 UTC m=+401.872628924" Apr 22 14:21:57.544495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.544461 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:57.544658 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:57.544622 2573 projected.go:264] Couldn't get 
secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 22 14:21:57.544658 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:57.544646 2573 secret.go:281] references non-existent secret key: ca.crt Apr 22 14:21:57.544658 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:57.544657 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 14:21:57.544813 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:57.544669 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2xh4t: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 14:21:57.544813 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:57.544746 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates podName:26426a9c-b1b0-4f89-a02f-1e912bf03ece nodeName:}" failed. No retries permitted until 2026-04-22 14:21:58.544727853 +0000 UTC m=+403.342490466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates") pod "keda-operator-ffbb595cb-2xh4t" (UID: "26426a9c-b1b0-4f89-a02f-1e912bf03ece") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 14:21:57.591671 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.591643 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-4gmwb"] Apr 22 14:21:57.595429 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.595411 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:21:57.597432 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.597409 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 14:21:57.605897 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.605874 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-4gmwb"] Apr 22 14:21:57.645519 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.645486 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4a5fc734-c6ca-429d-a96d-2a7659886439-certificates\") pod \"keda-admission-cf49989db-4gmwb\" (UID: \"4a5fc734-c6ca-429d-a96d-2a7659886439\") " pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:21:57.645662 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.645546 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9cb5\" (UniqueName: \"kubernetes.io/projected/4a5fc734-c6ca-429d-a96d-2a7659886439-kube-api-access-s9cb5\") pod \"keda-admission-cf49989db-4gmwb\" (UID: \"4a5fc734-c6ca-429d-a96d-2a7659886439\") " pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:21:57.746896 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.746859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4a5fc734-c6ca-429d-a96d-2a7659886439-certificates\") pod \"keda-admission-cf49989db-4gmwb\" (UID: \"4a5fc734-c6ca-429d-a96d-2a7659886439\") " pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:21:57.747081 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.746927 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9cb5\" (UniqueName: 
\"kubernetes.io/projected/4a5fc734-c6ca-429d-a96d-2a7659886439-kube-api-access-s9cb5\") pod \"keda-admission-cf49989db-4gmwb\" (UID: \"4a5fc734-c6ca-429d-a96d-2a7659886439\") " pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:21:57.751493 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.751441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4a5fc734-c6ca-429d-a96d-2a7659886439-certificates\") pod \"keda-admission-cf49989db-4gmwb\" (UID: \"4a5fc734-c6ca-429d-a96d-2a7659886439\") " pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:21:57.761966 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.761934 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9cb5\" (UniqueName: \"kubernetes.io/projected/4a5fc734-c6ca-429d-a96d-2a7659886439-kube-api-access-s9cb5\") pod \"keda-admission-cf49989db-4gmwb\" (UID: \"4a5fc734-c6ca-429d-a96d-2a7659886439\") " pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:21:57.906860 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:57.906778 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:21:58.050618 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:58.050163 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-4gmwb"] Apr 22 14:21:58.054282 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:21:58.054257 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a5fc734_c6ca_429d_a96d_2a7659886439.slice/crio-c4dfdfd45fb7bca9ba8efe6cc7a37ad2dfbd7077162fc2529d97a5bb7151de40 WatchSource:0}: Error finding container c4dfdfd45fb7bca9ba8efe6cc7a37ad2dfbd7077162fc2529d97a5bb7151de40: Status 404 returned error can't find the container with id c4dfdfd45fb7bca9ba8efe6cc7a37ad2dfbd7077162fc2529d97a5bb7151de40 Apr 22 14:21:58.555185 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:58.555144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:21:58.555376 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:58.555296 2573 secret.go:281] references non-existent secret key: ca.crt Apr 22 14:21:58.555376 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:58.555314 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 14:21:58.555376 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:58.555325 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2xh4t: references non-existent secret key: ca.crt Apr 22 14:21:58.555529 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:21:58.555381 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates podName:26426a9c-b1b0-4f89-a02f-1e912bf03ece nodeName:}" failed. No retries permitted until 2026-04-22 14:22:00.555364083 +0000 UTC m=+405.353126683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates") pod "keda-operator-ffbb595cb-2xh4t" (UID: "26426a9c-b1b0-4f89-a02f-1e912bf03ece") : references non-existent secret key: ca.crt Apr 22 14:21:59.064099 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:21:59.059780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-4gmwb" event={"ID":"4a5fc734-c6ca-429d-a96d-2a7659886439","Type":"ContainerStarted","Data":"c4dfdfd45fb7bca9ba8efe6cc7a37ad2dfbd7077162fc2529d97a5bb7151de40"} Apr 22 14:22:00.063933 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:00.063895 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-4gmwb" event={"ID":"4a5fc734-c6ca-429d-a96d-2a7659886439","Type":"ContainerStarted","Data":"2250c41c928a1e4fb33c9c22e11f6bc4c8433f67c0a1c6e17f14cc9203e111fd"} Apr 22 14:22:00.064152 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:00.064053 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:22:00.083278 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:00.083229 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-4gmwb" podStartSLOduration=1.8656939879999999 podStartE2EDuration="3.08321602s" podCreationTimestamp="2026-04-22 14:21:57 +0000 UTC" firstStartedPulling="2026-04-22 14:21:58.055730182 +0000 UTC m=+402.853492781" lastFinishedPulling="2026-04-22 14:21:59.2732522 +0000 UTC m=+404.071014813" observedRunningTime="2026-04-22 14:22:00.08033671 +0000 UTC m=+404.878099332" 
watchObservedRunningTime="2026-04-22 14:22:00.08321602 +0000 UTC m=+404.880978640" Apr 22 14:22:00.571429 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:00.571391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:22:00.571593 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:22:00.571534 2573 secret.go:281] references non-existent secret key: ca.crt Apr 22 14:22:00.571593 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:22:00.571552 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 14:22:00.571593 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:22:00.571560 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2xh4t: references non-existent secret key: ca.crt Apr 22 14:22:00.571719 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:22:00.571612 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates podName:26426a9c-b1b0-4f89-a02f-1e912bf03ece nodeName:}" failed. No retries permitted until 2026-04-22 14:22:04.571596569 +0000 UTC m=+409.369359173 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates") pod "keda-operator-ffbb595cb-2xh4t" (UID: "26426a9c-b1b0-4f89-a02f-1e912bf03ece") : references non-existent secret key: ca.crt Apr 22 14:22:04.601019 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:04.600983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:22:04.603328 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:04.603308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/26426a9c-b1b0-4f89-a02f-1e912bf03ece-certificates\") pod \"keda-operator-ffbb595cb-2xh4t\" (UID: \"26426a9c-b1b0-4f89-a02f-1e912bf03ece\") " pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:22:04.692440 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:04.692390 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:22:04.812533 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:04.812504 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2xh4t"] Apr 22 14:22:04.815060 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:22:04.815034 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26426a9c_b1b0_4f89_a02f_1e912bf03ece.slice/crio-017a769b9b4d5d63a8a09a71f4bd249edb40c82245f651a63c0a49ac5b04a6f6 WatchSource:0}: Error finding container 017a769b9b4d5d63a8a09a71f4bd249edb40c82245f651a63c0a49ac5b04a6f6: Status 404 returned error can't find the container with id 017a769b9b4d5d63a8a09a71f4bd249edb40c82245f651a63c0a49ac5b04a6f6 Apr 22 14:22:05.080598 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:05.080566 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" event={"ID":"26426a9c-b1b0-4f89-a02f-1e912bf03ece","Type":"ContainerStarted","Data":"017a769b9b4d5d63a8a09a71f4bd249edb40c82245f651a63c0a49ac5b04a6f6"} Apr 22 14:22:11.102837 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:11.102798 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" event={"ID":"26426a9c-b1b0-4f89-a02f-1e912bf03ece","Type":"ContainerStarted","Data":"5bd1dc7b477b998163c4b7384752e3be18d94bc260b92a5afab8ab177682a1ed"} Apr 22 14:22:11.103181 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:11.102878 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:22:11.131555 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:11.131512 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" podStartSLOduration=9.762017905 podStartE2EDuration="15.131498281s" 
podCreationTimestamp="2026-04-22 14:21:56 +0000 UTC" firstStartedPulling="2026-04-22 14:22:04.816400632 +0000 UTC m=+409.614163231" lastFinishedPulling="2026-04-22 14:22:10.185881004 +0000 UTC m=+414.983643607" observedRunningTime="2026-04-22 14:22:11.130250404 +0000 UTC m=+415.928013026" watchObservedRunningTime="2026-04-22 14:22:11.131498281 +0000 UTC m=+415.929260923" Apr 22 14:22:18.056391 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:18.056358 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ls2q2" Apr 22 14:22:21.069121 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:21.069090 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-4gmwb" Apr 22 14:22:32.109347 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:22:32.109315 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-2xh4t" Apr 22 14:23:13.430880 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.430842 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-qrqbq"] Apr 22 14:23:13.434317 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.434299 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:13.436776 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.436746 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 14:23:13.436889 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.436784 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 14:23:13.437441 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.437422 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-mjf75\"" Apr 22 14:23:13.437543 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.437460 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 14:23:13.451126 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.451104 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-qrqbq"] Apr 22 14:23:13.569954 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.569923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4592166-9dd7-418d-b2fa-2db51e9b9852-cert\") pod \"kserve-controller-manager-66cf78b85b-qrqbq\" (UID: \"c4592166-9dd7-418d-b2fa-2db51e9b9852\") " pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:13.570113 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.569982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhbks\" (UniqueName: \"kubernetes.io/projected/c4592166-9dd7-418d-b2fa-2db51e9b9852-kube-api-access-mhbks\") pod \"kserve-controller-manager-66cf78b85b-qrqbq\" (UID: \"c4592166-9dd7-418d-b2fa-2db51e9b9852\") " pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 
14:23:13.670868 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.670842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4592166-9dd7-418d-b2fa-2db51e9b9852-cert\") pod \"kserve-controller-manager-66cf78b85b-qrqbq\" (UID: \"c4592166-9dd7-418d-b2fa-2db51e9b9852\") " pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:13.671008 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.670895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhbks\" (UniqueName: \"kubernetes.io/projected/c4592166-9dd7-418d-b2fa-2db51e9b9852-kube-api-access-mhbks\") pod \"kserve-controller-manager-66cf78b85b-qrqbq\" (UID: \"c4592166-9dd7-418d-b2fa-2db51e9b9852\") " pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:13.673242 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.673213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4592166-9dd7-418d-b2fa-2db51e9b9852-cert\") pod \"kserve-controller-manager-66cf78b85b-qrqbq\" (UID: \"c4592166-9dd7-418d-b2fa-2db51e9b9852\") " pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:13.680758 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.680730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhbks\" (UniqueName: \"kubernetes.io/projected/c4592166-9dd7-418d-b2fa-2db51e9b9852-kube-api-access-mhbks\") pod \"kserve-controller-manager-66cf78b85b-qrqbq\" (UID: \"c4592166-9dd7-418d-b2fa-2db51e9b9852\") " pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:13.744937 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.744872 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:13.870201 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:13.870178 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-qrqbq"] Apr 22 14:23:13.873484 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:23:13.873455 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4592166_9dd7_418d_b2fa_2db51e9b9852.slice/crio-a134099e323d0e1df2b1a13d2707a90419fa603eeeea34d1e5e6dae4af94ca5f WatchSource:0}: Error finding container a134099e323d0e1df2b1a13d2707a90419fa603eeeea34d1e5e6dae4af94ca5f: Status 404 returned error can't find the container with id a134099e323d0e1df2b1a13d2707a90419fa603eeeea34d1e5e6dae4af94ca5f Apr 22 14:23:14.312742 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:14.312707 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" event={"ID":"c4592166-9dd7-418d-b2fa-2db51e9b9852","Type":"ContainerStarted","Data":"a134099e323d0e1df2b1a13d2707a90419fa603eeeea34d1e5e6dae4af94ca5f"} Apr 22 14:23:16.321739 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:16.321708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" event={"ID":"c4592166-9dd7-418d-b2fa-2db51e9b9852","Type":"ContainerStarted","Data":"72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141"} Apr 22 14:23:16.322099 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:16.321802 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:16.345594 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:16.345553 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" podStartSLOduration=1.018352971 
podStartE2EDuration="3.345540362s" podCreationTimestamp="2026-04-22 14:23:13 +0000 UTC" firstStartedPulling="2026-04-22 14:23:13.875157212 +0000 UTC m=+478.672919812" lastFinishedPulling="2026-04-22 14:23:16.20234459 +0000 UTC m=+481.000107203" observedRunningTime="2026-04-22 14:23:16.345044398 +0000 UTC m=+481.142807021" watchObservedRunningTime="2026-04-22 14:23:16.345540362 +0000 UTC m=+481.143302983" Apr 22 14:23:40.773556 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.773517 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d9b896989-qnxsz"] Apr 22 14:23:40.776944 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.776928 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.792369 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.792345 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d9b896989-qnxsz"] Apr 22 14:23:40.897307 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.897270 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-console-config\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.897477 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.897313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-oauth-serving-cert\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.897477 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.897337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-m9dmz\" (UniqueName: \"kubernetes.io/projected/79717c4c-c53e-450a-b5ff-ce14751d7d43-kube-api-access-m9dmz\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.897477 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.897398 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-service-ca\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.897477 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.897431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-trusted-ca-bundle\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.897670 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.897497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/79717c4c-c53e-450a-b5ff-ce14751d7d43-console-serving-cert\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.897670 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.897539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/79717c4c-c53e-450a-b5ff-ce14751d7d43-console-oauth-config\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 
14:23:40.997974 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.997947 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-console-config\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.997974 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.997980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-oauth-serving-cert\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.998212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.997999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9dmz\" (UniqueName: \"kubernetes.io/projected/79717c4c-c53e-450a-b5ff-ce14751d7d43-kube-api-access-m9dmz\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.998212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.998028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-service-ca\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.998212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.998057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-trusted-ca-bundle\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " 
pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.998212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.998093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/79717c4c-c53e-450a-b5ff-ce14751d7d43-console-serving-cert\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.998212 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.998128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/79717c4c-c53e-450a-b5ff-ce14751d7d43-console-oauth-config\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.998672 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.998643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-oauth-serving-cert\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.998810 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.998770 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-console-config\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.998950 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.998929 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-service-ca\") pod \"console-d9b896989-qnxsz\" (UID: 
\"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:40.998994 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:40.998938 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79717c4c-c53e-450a-b5ff-ce14751d7d43-trusted-ca-bundle\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:41.000427 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:41.000405 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/79717c4c-c53e-450a-b5ff-ce14751d7d43-console-oauth-config\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:41.000519 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:41.000497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/79717c4c-c53e-450a-b5ff-ce14751d7d43-console-serving-cert\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:41.005908 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:41.005892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9dmz\" (UniqueName: \"kubernetes.io/projected/79717c4c-c53e-450a-b5ff-ce14751d7d43-kube-api-access-m9dmz\") pod \"console-d9b896989-qnxsz\" (UID: \"79717c4c-c53e-450a-b5ff-ce14751d7d43\") " pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:41.086519 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:41.086455 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:41.212061 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:41.211867 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d9b896989-qnxsz"] Apr 22 14:23:41.212543 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:23:41.212515 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79717c4c_c53e_450a_b5ff_ce14751d7d43.slice/crio-419ba5e40e67a7c01a994dcbded928d31ab8cb77a28edffd52405a7849c86096 WatchSource:0}: Error finding container 419ba5e40e67a7c01a994dcbded928d31ab8cb77a28edffd52405a7849c86096: Status 404 returned error can't find the container with id 419ba5e40e67a7c01a994dcbded928d31ab8cb77a28edffd52405a7849c86096 Apr 22 14:23:41.411451 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:41.411368 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9b896989-qnxsz" event={"ID":"79717c4c-c53e-450a-b5ff-ce14751d7d43","Type":"ContainerStarted","Data":"062de9470d3388cdcec2759d976c7da9e237215bf205b076a8ff1708be921f32"} Apr 22 14:23:41.411451 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:41.411404 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9b896989-qnxsz" event={"ID":"79717c4c-c53e-450a-b5ff-ce14751d7d43","Type":"ContainerStarted","Data":"419ba5e40e67a7c01a994dcbded928d31ab8cb77a28edffd52405a7849c86096"} Apr 22 14:23:41.430807 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:41.430758 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d9b896989-qnxsz" podStartSLOduration=1.430744029 podStartE2EDuration="1.430744029s" podCreationTimestamp="2026-04-22 14:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:23:41.429702884 +0000 UTC m=+506.227465499" 
watchObservedRunningTime="2026-04-22 14:23:41.430744029 +0000 UTC m=+506.228506649" Apr 22 14:23:47.330820 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:47.330787 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:49.936608 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:49.936578 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-qrqbq"] Apr 22 14:23:49.937013 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:49.936824 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" podUID="c4592166-9dd7-418d-b2fa-2db51e9b9852" containerName="manager" containerID="cri-o://72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141" gracePeriod=10 Apr 22 14:23:49.966849 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:49.966824 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-sdmgs"] Apr 22 14:23:49.986633 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:49.986607 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-sdmgs"] Apr 22 14:23:49.986770 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:49.986757 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:23:50.070647 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.070616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4a98486-a759-4b82-a436-76232713ca74-cert\") pod \"kserve-controller-manager-66cf78b85b-sdmgs\" (UID: \"c4a98486-a759-4b82-a436-76232713ca74\") " pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:23:50.070765 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.070665 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4sfj\" (UniqueName: \"kubernetes.io/projected/c4a98486-a759-4b82-a436-76232713ca74-kube-api-access-j4sfj\") pod \"kserve-controller-manager-66cf78b85b-sdmgs\" (UID: \"c4a98486-a759-4b82-a436-76232713ca74\") " pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:23:50.172059 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.172033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4a98486-a759-4b82-a436-76232713ca74-cert\") pod \"kserve-controller-manager-66cf78b85b-sdmgs\" (UID: \"c4a98486-a759-4b82-a436-76232713ca74\") " pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:23:50.172163 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.172100 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4sfj\" (UniqueName: \"kubernetes.io/projected/c4a98486-a759-4b82-a436-76232713ca74-kube-api-access-j4sfj\") pod \"kserve-controller-manager-66cf78b85b-sdmgs\" (UID: \"c4a98486-a759-4b82-a436-76232713ca74\") " pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:23:50.174381 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.174358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/c4a98486-a759-4b82-a436-76232713ca74-cert\") pod \"kserve-controller-manager-66cf78b85b-sdmgs\" (UID: \"c4a98486-a759-4b82-a436-76232713ca74\") " pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:23:50.180295 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.180270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4sfj\" (UniqueName: \"kubernetes.io/projected/c4a98486-a759-4b82-a436-76232713ca74-kube-api-access-j4sfj\") pod \"kserve-controller-manager-66cf78b85b-sdmgs\" (UID: \"c4a98486-a759-4b82-a436-76232713ca74\") " pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:23:50.195071 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.195025 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:50.272507 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.272475 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhbks\" (UniqueName: \"kubernetes.io/projected/c4592166-9dd7-418d-b2fa-2db51e9b9852-kube-api-access-mhbks\") pod \"c4592166-9dd7-418d-b2fa-2db51e9b9852\" (UID: \"c4592166-9dd7-418d-b2fa-2db51e9b9852\") " Apr 22 14:23:50.272659 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.272523 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4592166-9dd7-418d-b2fa-2db51e9b9852-cert\") pod \"c4592166-9dd7-418d-b2fa-2db51e9b9852\" (UID: \"c4592166-9dd7-418d-b2fa-2db51e9b9852\") " Apr 22 14:23:50.274495 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.274459 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4592166-9dd7-418d-b2fa-2db51e9b9852-cert" (OuterVolumeSpecName: "cert") pod "c4592166-9dd7-418d-b2fa-2db51e9b9852" (UID: "c4592166-9dd7-418d-b2fa-2db51e9b9852"). 
InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:23:50.274600 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.274467 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4592166-9dd7-418d-b2fa-2db51e9b9852-kube-api-access-mhbks" (OuterVolumeSpecName: "kube-api-access-mhbks") pod "c4592166-9dd7-418d-b2fa-2db51e9b9852" (UID: "c4592166-9dd7-418d-b2fa-2db51e9b9852"). InnerVolumeSpecName "kube-api-access-mhbks". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:50.333160 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.333126 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:23:50.373277 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.373244 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mhbks\" (UniqueName: \"kubernetes.io/projected/c4592166-9dd7-418d-b2fa-2db51e9b9852-kube-api-access-mhbks\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:23:50.373277 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.373275 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4592166-9dd7-418d-b2fa-2db51e9b9852-cert\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:23:50.441856 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.441824 2573 generic.go:358] "Generic (PLEG): container finished" podID="c4592166-9dd7-418d-b2fa-2db51e9b9852" containerID="72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141" exitCode=0 Apr 22 14:23:50.441994 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.441895 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" Apr 22 14:23:50.441994 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.441950 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" event={"ID":"c4592166-9dd7-418d-b2fa-2db51e9b9852","Type":"ContainerDied","Data":"72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141"} Apr 22 14:23:50.441994 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.441987 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-qrqbq" event={"ID":"c4592166-9dd7-418d-b2fa-2db51e9b9852","Type":"ContainerDied","Data":"a134099e323d0e1df2b1a13d2707a90419fa603eeeea34d1e5e6dae4af94ca5f"} Apr 22 14:23:50.442108 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.442007 2573 scope.go:117] "RemoveContainer" containerID="72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141" Apr 22 14:23:50.450777 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.450757 2573 scope.go:117] "RemoveContainer" containerID="72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141" Apr 22 14:23:50.451151 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:23:50.451089 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141\": container with ID starting with 72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141 not found: ID does not exist" containerID="72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141" Apr 22 14:23:50.451151 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.451122 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141"} err="failed to get container status \"72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141\": rpc 
error: code = NotFound desc = could not find container \"72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141\": container with ID starting with 72e8301cf8909af8fddd4e815d0a528673ba7f96c22c120530a45707ec2f8141 not found: ID does not exist" Apr 22 14:23:50.453612 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.453555 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-sdmgs"] Apr 22 14:23:50.455664 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:23:50.455642 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4a98486_a759_4b82_a436_76232713ca74.slice/crio-a59154fd146d83ce859589314a9fc1bb2a6d744954765aeb98c6d37286acda8f WatchSource:0}: Error finding container a59154fd146d83ce859589314a9fc1bb2a6d744954765aeb98c6d37286acda8f: Status 404 returned error can't find the container with id a59154fd146d83ce859589314a9fc1bb2a6d744954765aeb98c6d37286acda8f Apr 22 14:23:50.468882 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.468862 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-qrqbq"] Apr 22 14:23:50.474156 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:50.474137 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-qrqbq"] Apr 22 14:23:51.087576 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.087497 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:51.087576 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.087537 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:51.091766 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.091745 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:51.447238 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.447209 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" event={"ID":"c4a98486-a759-4b82-a436-76232713ca74","Type":"ContainerStarted","Data":"3663b910601f0bdaf58779afb5108b53aa806ce1e04ba0656ef41544dfb3fc9d"} Apr 22 14:23:51.447238 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.447242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" event={"ID":"c4a98486-a759-4b82-a436-76232713ca74","Type":"ContainerStarted","Data":"a59154fd146d83ce859589314a9fc1bb2a6d744954765aeb98c6d37286acda8f"} Apr 22 14:23:51.447435 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.447407 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:23:51.450867 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.450846 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d9b896989-qnxsz" Apr 22 14:23:51.467349 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.467315 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" podStartSLOduration=2.115012541 podStartE2EDuration="2.467302886s" podCreationTimestamp="2026-04-22 14:23:49 +0000 UTC" firstStartedPulling="2026-04-22 14:23:50.456821374 +0000 UTC m=+515.254583973" lastFinishedPulling="2026-04-22 14:23:50.809111712 +0000 UTC m=+515.606874318" observedRunningTime="2026-04-22 14:23:51.465422101 +0000 UTC m=+516.263184721" watchObservedRunningTime="2026-04-22 14:23:51.467302886 +0000 UTC m=+516.265065506" Apr 22 14:23:51.529442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.529417 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c7d65d555-qgblc"] Apr 22 
14:23:51.778607 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:23:51.778579 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4592166-9dd7-418d-b2fa-2db51e9b9852" path="/var/lib/kubelet/pods/c4592166-9dd7-418d-b2fa-2db51e9b9852/volumes" Apr 22 14:24:16.548144 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.548078 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c7d65d555-qgblc" podUID="a69bc128-590b-4933-aa63-ffa8ce995526" containerName="console" containerID="cri-o://959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba" gracePeriod=15 Apr 22 14:24:16.791801 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.791776 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c7d65d555-qgblc_a69bc128-590b-4933-aa63-ffa8ce995526/console/0.log" Apr 22 14:24:16.791941 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.791852 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c7d65d555-qgblc" Apr 22 14:24:16.903280 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903209 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-oauth-serving-cert\") pod \"a69bc128-590b-4933-aa63-ffa8ce995526\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " Apr 22 14:24:16.903280 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903241 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-oauth-config\") pod \"a69bc128-590b-4933-aa63-ffa8ce995526\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " Apr 22 14:24:16.903280 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903266 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-serving-cert\") pod \"a69bc128-590b-4933-aa63-ffa8ce995526\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " Apr 22 14:24:16.903523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903350 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-trusted-ca-bundle\") pod \"a69bc128-590b-4933-aa63-ffa8ce995526\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " Apr 22 14:24:16.903523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903379 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d99bg\" (UniqueName: \"kubernetes.io/projected/a69bc128-590b-4933-aa63-ffa8ce995526-kube-api-access-d99bg\") pod \"a69bc128-590b-4933-aa63-ffa8ce995526\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " Apr 22 
14:24:16.903523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903411 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-service-ca\") pod \"a69bc128-590b-4933-aa63-ffa8ce995526\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " Apr 22 14:24:16.903523 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903460 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-console-config\") pod \"a69bc128-590b-4933-aa63-ffa8ce995526\" (UID: \"a69bc128-590b-4933-aa63-ffa8ce995526\") " Apr 22 14:24:16.903754 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903720 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a69bc128-590b-4933-aa63-ffa8ce995526" (UID: "a69bc128-590b-4933-aa63-ffa8ce995526"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:24:16.903825 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903780 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a69bc128-590b-4933-aa63-ffa8ce995526" (UID: "a69bc128-590b-4933-aa63-ffa8ce995526"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:24:16.903912 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.903854 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-service-ca" (OuterVolumeSpecName: "service-ca") pod "a69bc128-590b-4933-aa63-ffa8ce995526" (UID: "a69bc128-590b-4933-aa63-ffa8ce995526"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:24:16.904033 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.904013 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-console-config" (OuterVolumeSpecName: "console-config") pod "a69bc128-590b-4933-aa63-ffa8ce995526" (UID: "a69bc128-590b-4933-aa63-ffa8ce995526"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:24:16.905413 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.905386 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a69bc128-590b-4933-aa63-ffa8ce995526" (UID: "a69bc128-590b-4933-aa63-ffa8ce995526"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:24:16.905520 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.905445 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69bc128-590b-4933-aa63-ffa8ce995526-kube-api-access-d99bg" (OuterVolumeSpecName: "kube-api-access-d99bg") pod "a69bc128-590b-4933-aa63-ffa8ce995526" (UID: "a69bc128-590b-4933-aa63-ffa8ce995526"). InnerVolumeSpecName "kube-api-access-d99bg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:24:16.905567 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:16.905520 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a69bc128-590b-4933-aa63-ffa8ce995526" (UID: "a69bc128-590b-4933-aa63-ffa8ce995526"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:24:17.004879 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.004850 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-console-config\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:24:17.004879 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.004874 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-oauth-serving-cert\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:24:17.004879 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.004885 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-oauth-config\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:24:17.005106 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.004894 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a69bc128-590b-4933-aa63-ffa8ce995526-console-serving-cert\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:24:17.005106 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.004904 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-trusted-ca-bundle\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:24:17.005106 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.004912 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d99bg\" (UniqueName: \"kubernetes.io/projected/a69bc128-590b-4933-aa63-ffa8ce995526-kube-api-access-d99bg\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:24:17.005106 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.004920 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a69bc128-590b-4933-aa63-ffa8ce995526-service-ca\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:24:17.536815 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.536789 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c7d65d555-qgblc_a69bc128-590b-4933-aa63-ffa8ce995526/console/0.log" Apr 22 14:24:17.536999 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.536831 2573 generic.go:358] "Generic (PLEG): container finished" podID="a69bc128-590b-4933-aa63-ffa8ce995526" containerID="959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba" exitCode=2 Apr 22 14:24:17.536999 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.536860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7d65d555-qgblc" event={"ID":"a69bc128-590b-4933-aa63-ffa8ce995526","Type":"ContainerDied","Data":"959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba"} Apr 22 14:24:17.536999 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.536889 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c7d65d555-qgblc" Apr 22 14:24:17.536999 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.536897 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7d65d555-qgblc" event={"ID":"a69bc128-590b-4933-aa63-ffa8ce995526","Type":"ContainerDied","Data":"a9f759706d05cb3fa273bac41dc8626be262334e080edfd743e6a7b3bbb0b11d"} Apr 22 14:24:17.536999 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.536912 2573 scope.go:117] "RemoveContainer" containerID="959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba" Apr 22 14:24:17.545199 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.545184 2573 scope.go:117] "RemoveContainer" containerID="959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba" Apr 22 14:24:17.545441 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:24:17.545425 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba\": container with ID starting with 959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba not found: ID does not exist" containerID="959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba" Apr 22 14:24:17.545518 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.545447 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba"} err="failed to get container status \"959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba\": rpc error: code = NotFound desc = could not find container \"959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba\": container with ID starting with 959bc419ebf44a193c44a90ae46f936c33deddfa0232010e4f27d6b4291adeba not found: ID does not exist" Apr 22 14:24:17.557131 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.557112 2573 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-c7d65d555-qgblc"] Apr 22 14:24:17.564038 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.564017 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c7d65d555-qgblc"] Apr 22 14:24:17.784052 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:17.784017 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69bc128-590b-4933-aa63-ffa8ce995526" path="/var/lib/kubelet/pods/a69bc128-590b-4933-aa63-ffa8ce995526/volumes" Apr 22 14:24:22.455014 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:22.454985 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-sdmgs" Apr 22 14:24:23.458071 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.458034 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-tdk8h"] Apr 22 14:24:23.458498 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.458387 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a69bc128-590b-4933-aa63-ffa8ce995526" containerName="console" Apr 22 14:24:23.458498 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.458398 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69bc128-590b-4933-aa63-ffa8ce995526" containerName="console" Apr 22 14:24:23.458498 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.458407 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4592166-9dd7-418d-b2fa-2db51e9b9852" containerName="manager" Apr 22 14:24:23.458498 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.458413 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4592166-9dd7-418d-b2fa-2db51e9b9852" containerName="manager" Apr 22 14:24:23.458498 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.458466 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a69bc128-590b-4933-aa63-ffa8ce995526" 
containerName="console" Apr 22 14:24:23.458498 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.458475 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4592166-9dd7-418d-b2fa-2db51e9b9852" containerName="manager" Apr 22 14:24:23.462938 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.462920 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:23.465606 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.465579 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-f59b4\"" Apr 22 14:24:23.466238 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.466215 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 14:24:23.469364 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.469344 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tdk8h"] Apr 22 14:24:23.560149 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.560111 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdaa5b6-06fa-460e-83ed-e8418eee7ec9-tls-certs\") pod \"model-serving-api-86f7b4b499-tdk8h\" (UID: \"9fdaa5b6-06fa-460e-83ed-e8418eee7ec9\") " pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:23.560325 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.560169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s224q\" (UniqueName: \"kubernetes.io/projected/9fdaa5b6-06fa-460e-83ed-e8418eee7ec9-kube-api-access-s224q\") pod \"model-serving-api-86f7b4b499-tdk8h\" (UID: \"9fdaa5b6-06fa-460e-83ed-e8418eee7ec9\") " pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:23.661604 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.661573 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdaa5b6-06fa-460e-83ed-e8418eee7ec9-tls-certs\") pod \"model-serving-api-86f7b4b499-tdk8h\" (UID: \"9fdaa5b6-06fa-460e-83ed-e8418eee7ec9\") " pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:23.661775 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.661611 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s224q\" (UniqueName: \"kubernetes.io/projected/9fdaa5b6-06fa-460e-83ed-e8418eee7ec9-kube-api-access-s224q\") pod \"model-serving-api-86f7b4b499-tdk8h\" (UID: \"9fdaa5b6-06fa-460e-83ed-e8418eee7ec9\") " pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:23.661775 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:24:23.661733 2573 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 22 14:24:23.661866 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:24:23.661804 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fdaa5b6-06fa-460e-83ed-e8418eee7ec9-tls-certs podName:9fdaa5b6-06fa-460e-83ed-e8418eee7ec9 nodeName:}" failed. No retries permitted until 2026-04-22 14:24:24.161785659 +0000 UTC m=+548.959548266 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/9fdaa5b6-06fa-460e-83ed-e8418eee7ec9-tls-certs") pod "model-serving-api-86f7b4b499-tdk8h" (UID: "9fdaa5b6-06fa-460e-83ed-e8418eee7ec9") : secret "model-serving-api-tls" not found Apr 22 14:24:23.672479 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:23.672446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s224q\" (UniqueName: \"kubernetes.io/projected/9fdaa5b6-06fa-460e-83ed-e8418eee7ec9-kube-api-access-s224q\") pod \"model-serving-api-86f7b4b499-tdk8h\" (UID: \"9fdaa5b6-06fa-460e-83ed-e8418eee7ec9\") " pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:24.166829 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:24.166790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdaa5b6-06fa-460e-83ed-e8418eee7ec9-tls-certs\") pod \"model-serving-api-86f7b4b499-tdk8h\" (UID: \"9fdaa5b6-06fa-460e-83ed-e8418eee7ec9\") " pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:24.169186 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:24.169161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdaa5b6-06fa-460e-83ed-e8418eee7ec9-tls-certs\") pod \"model-serving-api-86f7b4b499-tdk8h\" (UID: \"9fdaa5b6-06fa-460e-83ed-e8418eee7ec9\") " pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:24.374570 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:24.374513 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:24.497432 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:24.497298 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tdk8h"] Apr 22 14:24:24.500261 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:24:24.500232 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fdaa5b6_06fa_460e_83ed_e8418eee7ec9.slice/crio-1b7bed06a1631ad7058af5b0e741b7209b3826f7996f4932e09a400f2dabd88b WatchSource:0}: Error finding container 1b7bed06a1631ad7058af5b0e741b7209b3826f7996f4932e09a400f2dabd88b: Status 404 returned error can't find the container with id 1b7bed06a1631ad7058af5b0e741b7209b3826f7996f4932e09a400f2dabd88b Apr 22 14:24:24.562725 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:24.562683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tdk8h" event={"ID":"9fdaa5b6-06fa-460e-83ed-e8418eee7ec9","Type":"ContainerStarted","Data":"1b7bed06a1631ad7058af5b0e741b7209b3826f7996f4932e09a400f2dabd88b"} Apr 22 14:24:26.572240 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:26.572207 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tdk8h" event={"ID":"9fdaa5b6-06fa-460e-83ed-e8418eee7ec9","Type":"ContainerStarted","Data":"d99ee039ab5a2f6b0d5b32d962f573abcf45ea78fce545dee8d62ab0c0d9ed46"} Apr 22 14:24:26.572608 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:26.572253 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:26.592378 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:26.592336 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-tdk8h" podStartSLOduration=2.512445654 podStartE2EDuration="3.592319503s" podCreationTimestamp="2026-04-22 
14:24:23 +0000 UTC" firstStartedPulling="2026-04-22 14:24:24.502309706 +0000 UTC m=+549.300072309" lastFinishedPulling="2026-04-22 14:24:25.582183559 +0000 UTC m=+550.379946158" observedRunningTime="2026-04-22 14:24:26.591522934 +0000 UTC m=+551.389285565" watchObservedRunningTime="2026-04-22 14:24:26.592319503 +0000 UTC m=+551.390082126" Apr 22 14:24:37.580282 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:37.580255 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-tdk8h" Apr 22 14:24:50.420514 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.420482 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq"] Apr 22 14:24:50.423987 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.423968 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:50.426117 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.426081 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 22 14:24:50.426263 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.426137 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-m75pq\"" Apr 22 14:24:50.428849 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.428830 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq"] Apr 22 14:24:50.475352 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.475330 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ae900436-d046-429d-933b-a8c79c38c1fd-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-z5rjq\" (UID: \"ae900436-d046-429d-933b-a8c79c38c1fd\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:50.475451 ip-10-0-136-45 kubenswrapper[2573]: 
I0422 14:24:50.475388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd98h\" (UniqueName: \"kubernetes.io/projected/ae900436-d046-429d-933b-a8c79c38c1fd-kube-api-access-gd98h\") pod \"seaweedfs-tls-custom-ddd4dbfd-z5rjq\" (UID: \"ae900436-d046-429d-933b-a8c79c38c1fd\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:50.576442 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.576402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd98h\" (UniqueName: \"kubernetes.io/projected/ae900436-d046-429d-933b-a8c79c38c1fd-kube-api-access-gd98h\") pod \"seaweedfs-tls-custom-ddd4dbfd-z5rjq\" (UID: \"ae900436-d046-429d-933b-a8c79c38c1fd\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:50.576577 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.576472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ae900436-d046-429d-933b-a8c79c38c1fd-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-z5rjq\" (UID: \"ae900436-d046-429d-933b-a8c79c38c1fd\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:50.576805 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.576790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ae900436-d046-429d-933b-a8c79c38c1fd-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-z5rjq\" (UID: \"ae900436-d046-429d-933b-a8c79c38c1fd\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:50.584543 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.584520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd98h\" (UniqueName: \"kubernetes.io/projected/ae900436-d046-429d-933b-a8c79c38c1fd-kube-api-access-gd98h\") pod \"seaweedfs-tls-custom-ddd4dbfd-z5rjq\" (UID: \"ae900436-d046-429d-933b-a8c79c38c1fd\") " 
pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:50.734712 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:50.734609 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:51.058717 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:51.058676 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq"] Apr 22 14:24:51.061171 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:24:51.061133 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae900436_d046_429d_933b_a8c79c38c1fd.slice/crio-f36e8b51424fbe043387c9fb87fc28fdeaaba0b721467a4d071cc56d00edc6fc WatchSource:0}: Error finding container f36e8b51424fbe043387c9fb87fc28fdeaaba0b721467a4d071cc56d00edc6fc: Status 404 returned error can't find the container with id f36e8b51424fbe043387c9fb87fc28fdeaaba0b721467a4d071cc56d00edc6fc Apr 22 14:24:51.659419 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:51.659327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" event={"ID":"ae900436-d046-429d-933b-a8c79c38c1fd","Type":"ContainerStarted","Data":"f36e8b51424fbe043387c9fb87fc28fdeaaba0b721467a4d071cc56d00edc6fc"} Apr 22 14:24:54.673366 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:54.673329 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" event={"ID":"ae900436-d046-429d-933b-a8c79c38c1fd","Type":"ContainerStarted","Data":"347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc"} Apr 22 14:24:54.689213 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:54.689162 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" podStartSLOduration=1.908282184 podStartE2EDuration="4.689148368s" podCreationTimestamp="2026-04-22 14:24:50 +0000 UTC" 
firstStartedPulling="2026-04-22 14:24:51.062361005 +0000 UTC m=+575.860123607" lastFinishedPulling="2026-04-22 14:24:53.843227189 +0000 UTC m=+578.640989791" observedRunningTime="2026-04-22 14:24:54.687542611 +0000 UTC m=+579.485305232" watchObservedRunningTime="2026-04-22 14:24:54.689148368 +0000 UTC m=+579.486910989" Apr 22 14:24:56.100207 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:56.100171 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq"] Apr 22 14:24:56.680068 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:56.680028 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" podUID="ae900436-d046-429d-933b-a8c79c38c1fd" containerName="seaweedfs-tls-custom" containerID="cri-o://347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc" gracePeriod=30 Apr 22 14:24:57.911592 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:57.911569 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:58.043115 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.043085 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd98h\" (UniqueName: \"kubernetes.io/projected/ae900436-d046-429d-933b-a8c79c38c1fd-kube-api-access-gd98h\") pod \"ae900436-d046-429d-933b-a8c79c38c1fd\" (UID: \"ae900436-d046-429d-933b-a8c79c38c1fd\") " Apr 22 14:24:58.043269 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.043139 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ae900436-d046-429d-933b-a8c79c38c1fd-data\") pod \"ae900436-d046-429d-933b-a8c79c38c1fd\" (UID: \"ae900436-d046-429d-933b-a8c79c38c1fd\") " Apr 22 14:24:58.044379 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.044355 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae900436-d046-429d-933b-a8c79c38c1fd-data" (OuterVolumeSpecName: "data") pod "ae900436-d046-429d-933b-a8c79c38c1fd" (UID: "ae900436-d046-429d-933b-a8c79c38c1fd"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:24:58.045118 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.045099 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae900436-d046-429d-933b-a8c79c38c1fd-kube-api-access-gd98h" (OuterVolumeSpecName: "kube-api-access-gd98h") pod "ae900436-d046-429d-933b-a8c79c38c1fd" (UID: "ae900436-d046-429d-933b-a8c79c38c1fd"). InnerVolumeSpecName "kube-api-access-gd98h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:24:58.144303 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.144270 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gd98h\" (UniqueName: \"kubernetes.io/projected/ae900436-d046-429d-933b-a8c79c38c1fd-kube-api-access-gd98h\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:24:58.144303 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.144297 2573 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ae900436-d046-429d-933b-a8c79c38c1fd-data\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 14:24:58.687261 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.687222 2573 generic.go:358] "Generic (PLEG): container finished" podID="ae900436-d046-429d-933b-a8c79c38c1fd" containerID="347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc" exitCode=0 Apr 22 14:24:58.687406 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.687281 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" Apr 22 14:24:58.687406 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.687310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" event={"ID":"ae900436-d046-429d-933b-a8c79c38c1fd","Type":"ContainerDied","Data":"347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc"} Apr 22 14:24:58.687406 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.687348 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq" event={"ID":"ae900436-d046-429d-933b-a8c79c38c1fd","Type":"ContainerDied","Data":"f36e8b51424fbe043387c9fb87fc28fdeaaba0b721467a4d071cc56d00edc6fc"} Apr 22 14:24:58.687406 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.687364 2573 scope.go:117] "RemoveContainer" containerID="347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc" Apr 22 14:24:58.696782 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.696762 2573 scope.go:117] "RemoveContainer" containerID="347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc" Apr 22 14:24:58.697025 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:24:58.697006 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc\": container with ID starting with 347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc not found: ID does not exist" containerID="347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc" Apr 22 14:24:58.697063 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.697034 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc"} err="failed to get container status \"347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc\": rpc error: code = 
NotFound desc = could not find container \"347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc\": container with ID starting with 347064b8312c0ec0af511b4a0eb5460e91bebf9b361083884095e4378de06bfc not found: ID does not exist" Apr 22 14:24:58.709219 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.709196 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq"] Apr 22 14:24:58.713526 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:58.713504 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-z5rjq"] Apr 22 14:24:59.778208 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:24:59.778174 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae900436-d046-429d-933b-a8c79c38c1fd" path="/var/lib/kubelet/pods/ae900436-d046-429d-933b-a8c79c38c1fd/volumes" Apr 22 14:25:15.682859 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:25:15.682833 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:25:15.684726 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:25:15.684704 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:28:17.655978 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.655938 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm"] Apr 22 14:28:17.656451 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.656327 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae900436-d046-429d-933b-a8c79c38c1fd" containerName="seaweedfs-tls-custom" Apr 22 14:28:17.656451 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.656339 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae900436-d046-429d-933b-a8c79c38c1fd" 
containerName="seaweedfs-tls-custom" Apr 22 14:28:17.656451 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.656409 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae900436-d046-429d-933b-a8c79c38c1fd" containerName="seaweedfs-tls-custom" Apr 22 14:28:17.659135 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.659115 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" Apr 22 14:28:17.661180 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.661149 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zp85x\"" Apr 22 14:28:17.666527 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.666114 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm"] Apr 22 14:28:17.669280 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.669261 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" Apr 22 14:28:17.803330 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.803270 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm"] Apr 22 14:28:17.805600 ip-10-0-136-45 kubenswrapper[2573]: W0422 14:28:17.805574 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47d703f1_c3e3_4fe3_b847_f3a7a9ff8b58.slice/crio-a72ce88558ad98d29ed96a041b7dbba58d859dd45eb626af075f48eec66cba8f WatchSource:0}: Error finding container a72ce88558ad98d29ed96a041b7dbba58d859dd45eb626af075f48eec66cba8f: Status 404 returned error can't find the container with id a72ce88558ad98d29ed96a041b7dbba58d859dd45eb626af075f48eec66cba8f Apr 22 14:28:17.807148 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:17.807132 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:28:18.354900 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:18.354866 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" event={"ID":"47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58","Type":"ContainerStarted","Data":"a72ce88558ad98d29ed96a041b7dbba58d859dd45eb626af075f48eec66cba8f"} Apr 22 14:28:19.359462 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:19.359429 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" event={"ID":"47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58","Type":"ContainerStarted","Data":"7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9"} Apr 22 14:28:19.359896 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:19.359671 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" Apr 22 14:28:19.361266 
ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:19.361245 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" Apr 22 14:28:19.377920 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:28:19.377872 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" podStartSLOduration=1.276191205 podStartE2EDuration="2.377862675s" podCreationTimestamp="2026-04-22 14:28:17 +0000 UTC" firstStartedPulling="2026-04-22 14:28:17.807284079 +0000 UTC m=+782.605046681" lastFinishedPulling="2026-04-22 14:28:18.908955551 +0000 UTC m=+783.706718151" observedRunningTime="2026-04-22 14:28:19.375598778 +0000 UTC m=+784.173361397" watchObservedRunningTime="2026-04-22 14:28:19.377862675 +0000 UTC m=+784.175625374" Apr 22 14:29:42.753064 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:42.753022 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-8dwjm_47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58/kserve-container/0.log" Apr 22 14:29:42.896681 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:42.896654 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm"] Apr 22 14:29:43.645357 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:43.645319 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" podUID="47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58" containerName="kserve-container" containerID="cri-o://7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9" gracePeriod=30 Apr 22 14:29:43.878768 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:43.878744 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" Apr 22 14:29:44.649607 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:44.649577 2573 generic.go:358] "Generic (PLEG): container finished" podID="47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58" containerID="7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9" exitCode=2 Apr 22 14:29:44.649607 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:44.649610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" event={"ID":"47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58","Type":"ContainerDied","Data":"7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9"} Apr 22 14:29:44.649859 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:44.649630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" event={"ID":"47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58","Type":"ContainerDied","Data":"a72ce88558ad98d29ed96a041b7dbba58d859dd45eb626af075f48eec66cba8f"} Apr 22 14:29:44.649859 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:44.649631 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm" Apr 22 14:29:44.649859 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:44.649645 2573 scope.go:117] "RemoveContainer" containerID="7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9" Apr 22 14:29:44.658249 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:44.658228 2573 scope.go:117] "RemoveContainer" containerID="7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9" Apr 22 14:29:44.658466 ip-10-0-136-45 kubenswrapper[2573]: E0422 14:29:44.658449 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9\": container with ID starting with 7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9 not found: ID does not exist" containerID="7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9" Apr 22 14:29:44.658528 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:44.658473 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9"} err="failed to get container status \"7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9\": rpc error: code = NotFound desc = could not find container \"7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9\": container with ID starting with 7702a50fa0514e435348f0e772d9bd08c4ce66eb3b5931632f22edd2da379ae9 not found: ID does not exist" Apr 22 14:29:44.670490 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:44.670463 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm"] Apr 22 14:29:44.674313 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:44.674293 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8dwjm"] Apr 22 
14:29:45.779390 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:29:45.779358 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58" path="/var/lib/kubelet/pods/47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58/volumes" Apr 22 14:30:15.708596 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:30:15.708522 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:30:15.710746 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:30:15.710725 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:35:15.736943 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:35:15.736861 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:35:15.739318 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:35:15.739298 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:40:15.762135 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:40:15.762110 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:40:15.764590 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:40:15.764567 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:45:15.786806 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:45:15.786777 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:45:15.791028 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:45:15.791007 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:50:15.813030 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:50:15.813002 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:50:15.817608 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:50:15.817587 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:55:15.837601 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:55:15.837562 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 14:55:15.841610 ip-10-0-136-45 kubenswrapper[2573]: I0422 14:55:15.841590 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 15:00:15.864430 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:00:15.864399 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 15:00:15.870078 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:00:15.870056 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 15:01:59.002252 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.002213 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl"] Apr 22 15:01:59.002852 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.002677 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58" containerName="kserve-container" Apr 22 15:01:59.002852 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.002708 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58" containerName="kserve-container" Apr 22 15:01:59.002852 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.002846 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="47d703f1-c3e3-4fe3-b847-f3a7a9ff8b58" containerName="kserve-container" Apr 22 15:01:59.005993 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.005975 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" Apr 22 15:01:59.008827 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.008809 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zp85x\"" Apr 22 15:01:59.019283 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.019253 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl"] Apr 22 15:01:59.101141 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.101098 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c50697d4-5503-41b8-89b2-09104b0c870f-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-jjmdl\" (UID: \"c50697d4-5503-41b8-89b2-09104b0c870f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" Apr 22 15:01:59.202436 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.202399 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c50697d4-5503-41b8-89b2-09104b0c870f-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-jjmdl\" (UID: \"c50697d4-5503-41b8-89b2-09104b0c870f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" Apr 22 15:01:59.202854 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.202831 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c50697d4-5503-41b8-89b2-09104b0c870f-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-jjmdl\" (UID: \"c50697d4-5503-41b8-89b2-09104b0c870f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" Apr 22 15:01:59.316421 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.316384 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" Apr 22 15:01:59.456541 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.456507 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl"] Apr 22 15:01:59.459497 ip-10-0-136-45 kubenswrapper[2573]: W0422 15:01:59.459471 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc50697d4_5503_41b8_89b2_09104b0c870f.slice/crio-55b91d5848fc0071977ca1c8e880859aa04cbfb0337cb1c62ff2b649f1f379d8 WatchSource:0}: Error finding container 55b91d5848fc0071977ca1c8e880859aa04cbfb0337cb1c62ff2b649f1f379d8: Status 404 returned error can't find the container with id 55b91d5848fc0071977ca1c8e880859aa04cbfb0337cb1c62ff2b649f1f379d8 Apr 22 15:01:59.461334 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:01:59.461317 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:02:00.381218 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:02:00.381151 
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" event={"ID":"c50697d4-5503-41b8-89b2-09104b0c870f","Type":"ContainerStarted","Data":"55b91d5848fc0071977ca1c8e880859aa04cbfb0337cb1c62ff2b649f1f379d8"} Apr 22 15:02:05.400320 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:02:05.400279 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" event={"ID":"c50697d4-5503-41b8-89b2-09104b0c870f","Type":"ContainerStarted","Data":"24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec"} Apr 22 15:02:09.415121 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:02:09.415082 2573 generic.go:358] "Generic (PLEG): container finished" podID="c50697d4-5503-41b8-89b2-09104b0c870f" containerID="24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec" exitCode=0 Apr 22 15:02:09.415589 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:02:09.415159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" event={"ID":"c50697d4-5503-41b8-89b2-09104b0c870f","Type":"ContainerDied","Data":"24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec"} Apr 22 15:04:03.881907 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:03.881874 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" event={"ID":"c50697d4-5503-41b8-89b2-09104b0c870f","Type":"ContainerStarted","Data":"ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69"} Apr 22 15:04:03.882324 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:03.882060 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" Apr 22 15:04:03.883121 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:03.883096 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" podUID="c50697d4-5503-41b8-89b2-09104b0c870f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 22 15:04:03.908175 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:03.908077 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" podStartSLOduration=1.9972375900000001 podStartE2EDuration="2m5.908060216s" podCreationTimestamp="2026-04-22 15:01:58 +0000 UTC" firstStartedPulling="2026-04-22 15:01:59.461450235 +0000 UTC m=+2804.259212835" lastFinishedPulling="2026-04-22 15:04:03.372272862 +0000 UTC m=+2928.170035461" observedRunningTime="2026-04-22 15:04:03.904400527 +0000 UTC m=+2928.702163149" watchObservedRunningTime="2026-04-22 15:04:03.908060216 +0000 UTC m=+2928.705822837" Apr 22 15:04:04.885612 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:04.885572 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" podUID="c50697d4-5503-41b8-89b2-09104b0c870f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 22 15:04:14.887100 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:14.887074 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" Apr 22 15:04:20.432616 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:20.432575 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl"] Apr 22 15:04:20.433056 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:20.432934 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" podUID="c50697d4-5503-41b8-89b2-09104b0c870f" containerName="kserve-container" 
containerID="cri-o://ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69" gracePeriod=30 Apr 22 15:04:20.530127 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:20.530090 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl"] Apr 22 15:04:20.532799 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:20.532780 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" Apr 22 15:04:20.541739 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:20.541717 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl"] Apr 22 15:04:20.678928 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:20.678896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40c27250-e0eb-4365-b80c-8343c5f0e21e-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-7fjpl\" (UID: \"40c27250-e0eb-4365-b80c-8343c5f0e21e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" Apr 22 15:04:20.780067 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:20.780043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40c27250-e0eb-4365-b80c-8343c5f0e21e-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-7fjpl\" (UID: \"40c27250-e0eb-4365-b80c-8343c5f0e21e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" Apr 22 15:04:20.780406 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:20.780386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40c27250-e0eb-4365-b80c-8343c5f0e21e-kserve-provision-location\") pod 
\"isvc-xgboost-predictor-6dbc9d6d47-7fjpl\" (UID: \"40c27250-e0eb-4365-b80c-8343c5f0e21e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" Apr 22 15:04:20.843702 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:20.843664 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" Apr 22 15:04:21.040746 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:21.040721 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl"] Apr 22 15:04:21.042887 ip-10-0-136-45 kubenswrapper[2573]: W0422 15:04:21.042856 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40c27250_e0eb_4365_b80c_8343c5f0e21e.slice/crio-f9f30759ac316e5a310fa09091781beebe02d2bad09e586f1a0c4f82889b12d2 WatchSource:0}: Error finding container f9f30759ac316e5a310fa09091781beebe02d2bad09e586f1a0c4f82889b12d2: Status 404 returned error can't find the container with id f9f30759ac316e5a310fa09091781beebe02d2bad09e586f1a0c4f82889b12d2 Apr 22 15:04:21.942499 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:21.942468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" event={"ID":"40c27250-e0eb-4365-b80c-8343c5f0e21e","Type":"ContainerStarted","Data":"afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a"} Apr 22 15:04:21.942499 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:21.942500 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" event={"ID":"40c27250-e0eb-4365-b80c-8343c5f0e21e","Type":"ContainerStarted","Data":"f9f30759ac316e5a310fa09091781beebe02d2bad09e586f1a0c4f82889b12d2"} Apr 22 15:04:22.864424 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.864402 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" Apr 22 15:04:22.947422 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.947361 2573 generic.go:358] "Generic (PLEG): container finished" podID="c50697d4-5503-41b8-89b2-09104b0c870f" containerID="ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69" exitCode=0 Apr 22 15:04:22.947726 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.947420 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" Apr 22 15:04:22.947726 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.947438 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" event={"ID":"c50697d4-5503-41b8-89b2-09104b0c870f","Type":"ContainerDied","Data":"ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69"} Apr 22 15:04:22.947726 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.947470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl" event={"ID":"c50697d4-5503-41b8-89b2-09104b0c870f","Type":"ContainerDied","Data":"55b91d5848fc0071977ca1c8e880859aa04cbfb0337cb1c62ff2b649f1f379d8"} Apr 22 15:04:22.947726 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.947485 2573 scope.go:117] "RemoveContainer" containerID="ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69" Apr 22 15:04:22.955107 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.955092 2573 scope.go:117] "RemoveContainer" containerID="24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec" Apr 22 15:04:22.961748 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.961728 2573 scope.go:117] "RemoveContainer" containerID="ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69" Apr 22 15:04:22.961977 ip-10-0-136-45 kubenswrapper[2573]: E0422 15:04:22.961958 2573 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69\": container with ID starting with ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69 not found: ID does not exist" containerID="ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69" Apr 22 15:04:22.962029 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.961985 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69"} err="failed to get container status \"ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69\": rpc error: code = NotFound desc = could not find container \"ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69\": container with ID starting with ba0e8701d0a56b092dbc0e75cc33fa2b46376a297986f683e37afff635313a69 not found: ID does not exist" Apr 22 15:04:22.962029 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.962002 2573 scope.go:117] "RemoveContainer" containerID="24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec" Apr 22 15:04:22.962222 ip-10-0-136-45 kubenswrapper[2573]: E0422 15:04:22.962207 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec\": container with ID starting with 24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec not found: ID does not exist" containerID="24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec" Apr 22 15:04:22.962261 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:22.962230 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec"} err="failed to get container status 
\"24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec\": rpc error: code = NotFound desc = could not find container \"24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec\": container with ID starting with 24c4a660c0bccb9a87210d022c864079aadc6283b92335b5131c4354f3beb3ec not found: ID does not exist" Apr 22 15:04:23.000551 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:23.000531 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c50697d4-5503-41b8-89b2-09104b0c870f-kserve-provision-location\") pod \"c50697d4-5503-41b8-89b2-09104b0c870f\" (UID: \"c50697d4-5503-41b8-89b2-09104b0c870f\") " Apr 22 15:04:23.000995 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:23.000974 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c50697d4-5503-41b8-89b2-09104b0c870f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c50697d4-5503-41b8-89b2-09104b0c870f" (UID: "c50697d4-5503-41b8-89b2-09104b0c870f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:04:23.101343 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:23.101318 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c50697d4-5503-41b8-89b2-09104b0c870f-kserve-provision-location\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 15:04:23.269387 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:23.269355 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl"] Apr 22 15:04:23.273146 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:23.273127 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjmdl"] Apr 22 15:04:23.778355 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:23.778325 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c50697d4-5503-41b8-89b2-09104b0c870f" path="/var/lib/kubelet/pods/c50697d4-5503-41b8-89b2-09104b0c870f/volumes" Apr 22 15:04:24.955904 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:24.955872 2573 generic.go:358] "Generic (PLEG): container finished" podID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerID="afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a" exitCode=0 Apr 22 15:04:24.956225 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:24.955945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" event={"ID":"40c27250-e0eb-4365-b80c-8343c5f0e21e","Type":"ContainerDied","Data":"afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a"} Apr 22 15:04:49.049727 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:49.049675 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" 
event={"ID":"40c27250-e0eb-4365-b80c-8343c5f0e21e","Type":"ContainerStarted","Data":"062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08"} Apr 22 15:04:49.050062 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:49.050028 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" Apr 22 15:04:49.051066 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:49.051044 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 22 15:04:49.068548 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:49.068510 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" podStartSLOduration=5.089730429 podStartE2EDuration="29.068500314s" podCreationTimestamp="2026-04-22 15:04:20 +0000 UTC" firstStartedPulling="2026-04-22 15:04:24.957234652 +0000 UTC m=+2949.754997251" lastFinishedPulling="2026-04-22 15:04:48.936004511 +0000 UTC m=+2973.733767136" observedRunningTime="2026-04-22 15:04:49.065670608 +0000 UTC m=+2973.863433228" watchObservedRunningTime="2026-04-22 15:04:49.068500314 +0000 UTC m=+2973.866262934" Apr 22 15:04:50.053788 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:04:50.053752 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 22 15:05:00.054399 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:00.054360 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" 
podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 22 15:05:10.053953 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:10.053914 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 22 15:05:15.890837 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:15.890808 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 15:05:15.896152 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:15.896133 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 15:05:20.054215 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:20.054178 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 22 15:05:30.054485 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:30.054442 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 22 15:05:40.054488 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:40.054446 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" 
podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 22 15:05:50.054903 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:50.054870 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" Apr 22 15:05:50.652099 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:50.652063 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl"] Apr 22 15:05:50.652590 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:50.652487 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" containerID="cri-o://062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08" gracePeriod=30 Apr 22 15:05:54.196040 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.196014 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" Apr 22 15:05:54.278844 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.278814 2573 generic.go:358] "Generic (PLEG): container finished" podID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerID="062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08" exitCode=0 Apr 22 15:05:54.279003 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.278877 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" Apr 22 15:05:54.279003 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.278896 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" event={"ID":"40c27250-e0eb-4365-b80c-8343c5f0e21e","Type":"ContainerDied","Data":"062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08"} Apr 22 15:05:54.279003 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.278935 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl" event={"ID":"40c27250-e0eb-4365-b80c-8343c5f0e21e","Type":"ContainerDied","Data":"f9f30759ac316e5a310fa09091781beebe02d2bad09e586f1a0c4f82889b12d2"} Apr 22 15:05:54.279003 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.278951 2573 scope.go:117] "RemoveContainer" containerID="062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08" Apr 22 15:05:54.286976 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.286944 2573 scope.go:117] "RemoveContainer" containerID="afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a" Apr 22 15:05:54.294249 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.294206 2573 scope.go:117] "RemoveContainer" containerID="062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08" Apr 22 15:05:54.294485 ip-10-0-136-45 kubenswrapper[2573]: E0422 15:05:54.294464 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08\": container with ID starting with 062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08 not found: ID does not exist" containerID="062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08" Apr 22 15:05:54.294531 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.294506 2573 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08"} err="failed to get container status \"062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08\": rpc error: code = NotFound desc = could not find container \"062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08\": container with ID starting with 062fbf18a93e1fcd879d18959b1139037434c9feb1a3e5f3986289fd4b931d08 not found: ID does not exist" Apr 22 15:05:54.294582 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.294530 2573 scope.go:117] "RemoveContainer" containerID="afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a" Apr 22 15:05:54.294794 ip-10-0-136-45 kubenswrapper[2573]: E0422 15:05:54.294776 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a\": container with ID starting with afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a not found: ID does not exist" containerID="afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a" Apr 22 15:05:54.294851 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.294801 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a"} err="failed to get container status \"afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a\": rpc error: code = NotFound desc = could not find container \"afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a\": container with ID starting with afe63b95cfbe1a6b9521dc76122a1f57424c75439ea289e5cb533e0ce260d84a not found: ID does not exist" Apr 22 15:05:54.320647 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.320625 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/40c27250-e0eb-4365-b80c-8343c5f0e21e-kserve-provision-location\") pod \"40c27250-e0eb-4365-b80c-8343c5f0e21e\" (UID: \"40c27250-e0eb-4365-b80c-8343c5f0e21e\") " Apr 22 15:05:54.320934 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.320911 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c27250-e0eb-4365-b80c-8343c5f0e21e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "40c27250-e0eb-4365-b80c-8343c5f0e21e" (UID: "40c27250-e0eb-4365-b80c-8343c5f0e21e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:05:54.421583 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.421554 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40c27250-e0eb-4365-b80c-8343c5f0e21e-kserve-provision-location\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\"" Apr 22 15:05:54.599704 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.599663 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl"] Apr 22 15:05:54.601441 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:54.601420 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-7fjpl"] Apr 22 15:05:55.778180 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:05:55.778142 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" path="/var/lib/kubelet/pods/40c27250-e0eb-4365-b80c-8343c5f0e21e/volumes" Apr 22 15:07:11.065408 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.065363 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf"] Apr 22 15:07:11.065971 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.065952 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="c50697d4-5503-41b8-89b2-09104b0c870f" containerName="storage-initializer" Apr 22 15:07:11.066026 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.065974 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50697d4-5503-41b8-89b2-09104b0c870f" containerName="storage-initializer" Apr 22 15:07:11.066026 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.065989 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c50697d4-5503-41b8-89b2-09104b0c870f" containerName="kserve-container" Apr 22 15:07:11.066026 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.065998 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50697d4-5503-41b8-89b2-09104b0c870f" containerName="kserve-container" Apr 22 15:07:11.066026 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.066021 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="storage-initializer" Apr 22 15:07:11.066229 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.066032 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="storage-initializer" Apr 22 15:07:11.066229 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.066043 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" Apr 22 15:07:11.066229 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.066052 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" Apr 22 15:07:11.066229 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.066147 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c50697d4-5503-41b8-89b2-09104b0c870f" containerName="kserve-container" Apr 22 15:07:11.066229 ip-10-0-136-45 kubenswrapper[2573]: I0422 
15:07:11.066161 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="40c27250-e0eb-4365-b80c-8343c5f0e21e" containerName="kserve-container" Apr 22 15:07:11.069624 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.069601 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" Apr 22 15:07:11.071611 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.071587 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zp85x\"" Apr 22 15:07:11.077468 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.077353 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf"] Apr 22 15:07:11.194215 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.194172 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12850613-59cc-4168-a68b-e505279be7a0-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-6pwkf\" (UID: \"12850613-59cc-4168-a68b-e505279be7a0\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" Apr 22 15:07:11.294909 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.294863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12850613-59cc-4168-a68b-e505279be7a0-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-6pwkf\" (UID: \"12850613-59cc-4168-a68b-e505279be7a0\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" Apr 22 15:07:11.295258 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.295237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/12850613-59cc-4168-a68b-e505279be7a0-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-6pwkf\" (UID: \"12850613-59cc-4168-a68b-e505279be7a0\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" Apr 22 15:07:11.382256 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.382168 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" Apr 22 15:07:11.507463 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.507431 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf"] Apr 22 15:07:11.510284 ip-10-0-136-45 kubenswrapper[2573]: W0422 15:07:11.510246 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12850613_59cc_4168_a68b_e505279be7a0.slice/crio-42dc6d6469b522ff1f034c07cb7a55493bea3f8f65cb755732719287d7bb267f WatchSource:0}: Error finding container 42dc6d6469b522ff1f034c07cb7a55493bea3f8f65cb755732719287d7bb267f: Status 404 returned error can't find the container with id 42dc6d6469b522ff1f034c07cb7a55493bea3f8f65cb755732719287d7bb267f Apr 22 15:07:11.512090 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.512072 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:07:11.559658 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:11.559628 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" event={"ID":"12850613-59cc-4168-a68b-e505279be7a0","Type":"ContainerStarted","Data":"42dc6d6469b522ff1f034c07cb7a55493bea3f8f65cb755732719287d7bb267f"} Apr 22 15:07:12.564676 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:12.564638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" event={"ID":"12850613-59cc-4168-a68b-e505279be7a0","Type":"ContainerStarted","Data":"fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e"} Apr 22 15:07:15.576888 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:15.576800 2573 generic.go:358] "Generic (PLEG): container finished" podID="12850613-59cc-4168-a68b-e505279be7a0" containerID="fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e" exitCode=0 Apr 22 15:07:15.576888 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:15.576845 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" event={"ID":"12850613-59cc-4168-a68b-e505279be7a0","Type":"ContainerDied","Data":"fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e"} Apr 22 15:07:16.582434 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:16.582396 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" event={"ID":"12850613-59cc-4168-a68b-e505279be7a0","Type":"ContainerStarted","Data":"a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886"} Apr 22 15:07:16.582952 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:16.582708 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" Apr 22 15:07:16.584141 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:16.584115 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 22 15:07:16.599576 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:16.599518 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" podStartSLOduration=5.599503267 podStartE2EDuration="5.599503267s" podCreationTimestamp="2026-04-22 15:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:07:16.597446735 +0000 UTC m=+3121.395209358" watchObservedRunningTime="2026-04-22 15:07:16.599503267 +0000 UTC m=+3121.397265888"
Apr 22 15:07:17.586529 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:17.586490 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 22 15:07:27.587014 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:27.586969 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 22 15:07:37.586955 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:37.586870 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 22 15:07:47.586784 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:47.586742 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 22 15:07:57.587138 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:07:57.587091 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 22 15:08:07.587411 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:07.587369 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 22 15:08:17.587935 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:17.587902 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf"
Apr 22 15:08:21.211408 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:21.211363 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf"]
Apr 22 15:08:21.211891 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:21.211603 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container" containerID="cri-o://a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886" gracePeriod=30
Apr 22 15:08:24.960263 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:24.960227 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf"
Apr 22 15:08:25.052950 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.052919 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12850613-59cc-4168-a68b-e505279be7a0-kserve-provision-location\") pod \"12850613-59cc-4168-a68b-e505279be7a0\" (UID: \"12850613-59cc-4168-a68b-e505279be7a0\") "
Apr 22 15:08:25.053240 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.053215 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12850613-59cc-4168-a68b-e505279be7a0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12850613-59cc-4168-a68b-e505279be7a0" (UID: "12850613-59cc-4168-a68b-e505279be7a0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 15:08:25.153567 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.153529 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12850613-59cc-4168-a68b-e505279be7a0-kserve-provision-location\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\""
Apr 22 15:08:25.822513 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.822474 2573 generic.go:358] "Generic (PLEG): container finished" podID="12850613-59cc-4168-a68b-e505279be7a0" containerID="a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886" exitCode=0
Apr 22 15:08:25.822703 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.822552 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf"
Apr 22 15:08:25.822703 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.822566 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" event={"ID":"12850613-59cc-4168-a68b-e505279be7a0","Type":"ContainerDied","Data":"a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886"}
Apr 22 15:08:25.822703 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.822617 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf" event={"ID":"12850613-59cc-4168-a68b-e505279be7a0","Type":"ContainerDied","Data":"42dc6d6469b522ff1f034c07cb7a55493bea3f8f65cb755732719287d7bb267f"}
Apr 22 15:08:25.822703 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.822640 2573 scope.go:117] "RemoveContainer" containerID="a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886"
Apr 22 15:08:25.832801 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.832606 2573 scope.go:117] "RemoveContainer" containerID="fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e"
Apr 22 15:08:25.840443 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.840414 2573 scope.go:117] "RemoveContainer" containerID="a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886"
Apr 22 15:08:25.840904 ip-10-0-136-45 kubenswrapper[2573]: E0422 15:08:25.840881 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886\": container with ID starting with a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886 not found: ID does not exist" containerID="a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886"
Apr 22 15:08:25.841009 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.840940 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886"} err="failed to get container status \"a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886\": rpc error: code = NotFound desc = could not find container \"a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886\": container with ID starting with a60e3ddc832b9cdf19885c0f76e7ad83f5a78a5e05e8957b07fc1fa245b24886 not found: ID does not exist"
Apr 22 15:08:25.841009 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.840968 2573 scope.go:117] "RemoveContainer" containerID="fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e"
Apr 22 15:08:25.841269 ip-10-0-136-45 kubenswrapper[2573]: E0422 15:08:25.841243 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e\": container with ID starting with fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e not found: ID does not exist" containerID="fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e"
Apr 22 15:08:25.841337 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.841272 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e"} err="failed to get container status \"fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e\": rpc error: code = NotFound desc = could not find container \"fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e\": container with ID starting with fecc062ef032fb498a9ca90836b20b774637d016733297f8e974b3a2f0f6582e not found: ID does not exist"
Apr 22 15:08:25.841985 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.841967 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf"]
Apr 22 15:08:25.850294 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:25.850269 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-6pwkf"]
Apr 22 15:08:27.778334 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:08:27.778299 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12850613-59cc-4168-a68b-e505279be7a0" path="/var/lib/kubelet/pods/12850613-59cc-4168-a68b-e505279be7a0/volumes"
Apr 22 15:09:11.439870 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.439838 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"]
Apr 22 15:09:11.442134 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.440289 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="storage-initializer"
Apr 22 15:09:11.442134 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.440301 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="storage-initializer"
Apr 22 15:09:11.442134 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.440309 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container"
Apr 22 15:09:11.442134 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.440315 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container"
Apr 22 15:09:11.442134 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.440371 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="12850613-59cc-4168-a68b-e505279be7a0" containerName="kserve-container"
Apr 22 15:09:11.443085 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.443068 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"
Apr 22 15:09:11.446468 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.446446 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zp85x\""
Apr 22 15:09:11.454293 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.454270 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"]
Apr 22 15:09:11.529561 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.529528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9602220c-072e-43df-9f14-733c64e836c2-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-mrx6f\" (UID: \"9602220c-072e-43df-9f14-733c64e836c2\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"
Apr 22 15:09:11.630249 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.630209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9602220c-072e-43df-9f14-733c64e836c2-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-mrx6f\" (UID: \"9602220c-072e-43df-9f14-733c64e836c2\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"
Apr 22 15:09:11.630603 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.630582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9602220c-072e-43df-9f14-733c64e836c2-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-mrx6f\" (UID: \"9602220c-072e-43df-9f14-733c64e836c2\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"
Apr 22 15:09:11.755308 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.755226 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"
Apr 22 15:09:11.876542 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.876519 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"]
Apr 22 15:09:11.878588 ip-10-0-136-45 kubenswrapper[2573]: W0422 15:09:11.878559 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9602220c_072e_43df_9f14_733c64e836c2.slice/crio-dcfef517c8e81b49ff0e227e4eff897e8819b463f86d1cba0cbc3fb3f6753eb6 WatchSource:0}: Error finding container dcfef517c8e81b49ff0e227e4eff897e8819b463f86d1cba0cbc3fb3f6753eb6: Status 404 returned error can't find the container with id dcfef517c8e81b49ff0e227e4eff897e8819b463f86d1cba0cbc3fb3f6753eb6
Apr 22 15:09:11.984920 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.984885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" event={"ID":"9602220c-072e-43df-9f14-733c64e836c2","Type":"ContainerStarted","Data":"82a119f6acd85ddfbf136d1ed9c626763cf912238921c75f731258fb84155654"}
Apr 22 15:09:11.985071 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:11.984924 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" event={"ID":"9602220c-072e-43df-9f14-733c64e836c2","Type":"ContainerStarted","Data":"dcfef517c8e81b49ff0e227e4eff897e8819b463f86d1cba0cbc3fb3f6753eb6"}
Apr 22 15:09:16.000607 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:16.000576 2573 generic.go:358] "Generic (PLEG): container finished" podID="9602220c-072e-43df-9f14-733c64e836c2" containerID="82a119f6acd85ddfbf136d1ed9c626763cf912238921c75f731258fb84155654" exitCode=0
Apr 22 15:09:16.000939 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:16.000648 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" event={"ID":"9602220c-072e-43df-9f14-733c64e836c2","Type":"ContainerDied","Data":"82a119f6acd85ddfbf136d1ed9c626763cf912238921c75f731258fb84155654"}
Apr 22 15:09:17.005773 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:17.005740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" event={"ID":"9602220c-072e-43df-9f14-733c64e836c2","Type":"ContainerStarted","Data":"c874128976d07483d06a43bd5e08d1e16a8533007f985e7dd2a1308b5180fbd4"}
Apr 22 15:09:17.006168 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:17.006024 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"
Apr 22 15:09:17.007388 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:17.007365 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 22 15:09:17.041093 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:17.041047 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" podStartSLOduration=6.041033679 podStartE2EDuration="6.041033679s" podCreationTimestamp="2026-04-22 15:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:09:17.040883176 +0000 UTC m=+3241.838645798" watchObservedRunningTime="2026-04-22 15:09:17.041033679 +0000 UTC m=+3241.838796300"
Apr 22 15:09:18.009570 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:18.009535 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 22 15:09:28.009760 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:28.009714 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 22 15:09:38.009708 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:38.009640 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 22 15:09:48.009998 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:48.009954 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 22 15:09:58.009854 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:09:58.009813 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 22 15:10:08.010153 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:08.010112 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 22 15:10:15.917202 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:15.917170 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 15:10:15.923973 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:15.923944 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 15:10:18.010583 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:18.010556 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"
Apr 22 15:10:21.565516 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:21.565483 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"]
Apr 22 15:10:21.566004 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:21.565786 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container" containerID="cri-o://c874128976d07483d06a43bd5e08d1e16a8533007f985e7dd2a1308b5180fbd4" gracePeriod=30
Apr 22 15:10:25.254670 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:25.254636 2573 generic.go:358] "Generic (PLEG): container finished" podID="9602220c-072e-43df-9f14-733c64e836c2" containerID="c874128976d07483d06a43bd5e08d1e16a8533007f985e7dd2a1308b5180fbd4" exitCode=0
Apr 22 15:10:25.255040 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:25.254714 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" event={"ID":"9602220c-072e-43df-9f14-733c64e836c2","Type":"ContainerDied","Data":"c874128976d07483d06a43bd5e08d1e16a8533007f985e7dd2a1308b5180fbd4"}
Apr 22 15:10:25.345801 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:25.345778 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"
Apr 22 15:10:25.479786 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:25.479670 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9602220c-072e-43df-9f14-733c64e836c2-kserve-provision-location\") pod \"9602220c-072e-43df-9f14-733c64e836c2\" (UID: \"9602220c-072e-43df-9f14-733c64e836c2\") "
Apr 22 15:10:25.480055 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:25.480028 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9602220c-072e-43df-9f14-733c64e836c2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9602220c-072e-43df-9f14-733c64e836c2" (UID: "9602220c-072e-43df-9f14-733c64e836c2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 15:10:25.581151 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:25.581120 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9602220c-072e-43df-9f14-733c64e836c2-kserve-provision-location\") on node \"ip-10-0-136-45.ec2.internal\" DevicePath \"\""
Apr 22 15:10:26.259676 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:26.259594 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"
Apr 22 15:10:26.260118 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:26.259590 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f" event={"ID":"9602220c-072e-43df-9f14-733c64e836c2","Type":"ContainerDied","Data":"dcfef517c8e81b49ff0e227e4eff897e8819b463f86d1cba0cbc3fb3f6753eb6"}
Apr 22 15:10:26.260118 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:26.259727 2573 scope.go:117] "RemoveContainer" containerID="c874128976d07483d06a43bd5e08d1e16a8533007f985e7dd2a1308b5180fbd4"
Apr 22 15:10:26.267639 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:26.267619 2573 scope.go:117] "RemoveContainer" containerID="82a119f6acd85ddfbf136d1ed9c626763cf912238921c75f731258fb84155654"
Apr 22 15:10:26.275232 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:26.275214 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"]
Apr 22 15:10:26.279204 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:26.279183 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-mrx6f"]
Apr 22 15:10:27.778129 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:10:27.778089 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9602220c-072e-43df-9f14-733c64e836c2" path="/var/lib/kubelet/pods/9602220c-072e-43df-9f14-733c64e836c2/volumes"
Apr 22 15:15:15.942731 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:15:15.942677 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 15:15:15.949788 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:15:15.949765 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log"
Apr 22 15:16:28.893018 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.892984 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zm2sq/must-gather-fltcd"]
Apr 22 15:16:28.893435 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.893331 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="storage-initializer"
Apr 22 15:16:28.893435 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.893342 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="storage-initializer"
Apr 22 15:16:28.893435 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.893368 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container"
Apr 22 15:16:28.893435 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.893374 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container"
Apr 22 15:16:28.893435 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.893429 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9602220c-072e-43df-9f14-733c64e836c2" containerName="kserve-container"
Apr 22 15:16:28.896404 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.896388 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zm2sq/must-gather-fltcd"
Apr 22 15:16:28.898359 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.898335 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zm2sq\"/\"openshift-service-ca.crt\""
Apr 22 15:16:28.898888 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.898867 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zm2sq\"/\"default-dockercfg-jkn8s\""
Apr 22 15:16:28.898888 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.898881 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zm2sq\"/\"kube-root-ca.crt\""
Apr 22 15:16:28.902212 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:28.902179 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zm2sq/must-gather-fltcd"]
Apr 22 15:16:29.033434 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.033406 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e7037d4-8776-46a8-8daf-a1f99367994c-must-gather-output\") pod \"must-gather-fltcd\" (UID: \"0e7037d4-8776-46a8-8daf-a1f99367994c\") " pod="openshift-must-gather-zm2sq/must-gather-fltcd"
Apr 22 15:16:29.033587 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.033456 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxvzw\" (UniqueName: \"kubernetes.io/projected/0e7037d4-8776-46a8-8daf-a1f99367994c-kube-api-access-xxvzw\") pod \"must-gather-fltcd\" (UID: \"0e7037d4-8776-46a8-8daf-a1f99367994c\") " pod="openshift-must-gather-zm2sq/must-gather-fltcd"
Apr 22 15:16:29.134528 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.134500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxvzw\" (UniqueName: \"kubernetes.io/projected/0e7037d4-8776-46a8-8daf-a1f99367994c-kube-api-access-xxvzw\") pod \"must-gather-fltcd\" (UID: \"0e7037d4-8776-46a8-8daf-a1f99367994c\") " pod="openshift-must-gather-zm2sq/must-gather-fltcd"
Apr 22 15:16:29.134674 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.134556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e7037d4-8776-46a8-8daf-a1f99367994c-must-gather-output\") pod \"must-gather-fltcd\" (UID: \"0e7037d4-8776-46a8-8daf-a1f99367994c\") " pod="openshift-must-gather-zm2sq/must-gather-fltcd"
Apr 22 15:16:29.134847 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.134830 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e7037d4-8776-46a8-8daf-a1f99367994c-must-gather-output\") pod \"must-gather-fltcd\" (UID: \"0e7037d4-8776-46a8-8daf-a1f99367994c\") " pod="openshift-must-gather-zm2sq/must-gather-fltcd"
Apr 22 15:16:29.142286 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.142264 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxvzw\" (UniqueName: \"kubernetes.io/projected/0e7037d4-8776-46a8-8daf-a1f99367994c-kube-api-access-xxvzw\") pod \"must-gather-fltcd\" (UID: \"0e7037d4-8776-46a8-8daf-a1f99367994c\") " pod="openshift-must-gather-zm2sq/must-gather-fltcd"
Apr 22 15:16:29.212309 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.212245 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zm2sq/must-gather-fltcd"
Apr 22 15:16:29.331860 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.331839 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zm2sq/must-gather-fltcd"]
Apr 22 15:16:29.334177 ip-10-0-136-45 kubenswrapper[2573]: W0422 15:16:29.334151 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e7037d4_8776_46a8_8daf_a1f99367994c.slice/crio-be50a827b278fdcc6ed178a5574106d78b1c008754de738b35d218a8ace29a55 WatchSource:0}: Error finding container be50a827b278fdcc6ed178a5574106d78b1c008754de738b35d218a8ace29a55: Status 404 returned error can't find the container with id be50a827b278fdcc6ed178a5574106d78b1c008754de738b35d218a8ace29a55
Apr 22 15:16:29.335946 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.335924 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:16:29.503210 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:29.503147 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zm2sq/must-gather-fltcd" event={"ID":"0e7037d4-8776-46a8-8daf-a1f99367994c","Type":"ContainerStarted","Data":"be50a827b278fdcc6ed178a5574106d78b1c008754de738b35d218a8ace29a55"}
Apr 22 15:16:30.507666 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:30.507638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zm2sq/must-gather-fltcd" event={"ID":"0e7037d4-8776-46a8-8daf-a1f99367994c","Type":"ContainerStarted","Data":"59267e72764caf36ef413c263f083b130197e1313e2509ebf4ef394ea1ba5136"}
Apr 22 15:16:31.513783 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:31.513738 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zm2sq/must-gather-fltcd" event={"ID":"0e7037d4-8776-46a8-8daf-a1f99367994c","Type":"ContainerStarted","Data":"44d1379ef3128c748f86ab491976935b8fc6ed92f80a517f8ecb87df2b71c4e6"}
Apr 22 15:16:31.531547 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:31.531492 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zm2sq/must-gather-fltcd" podStartSLOduration=2.4917248880000002 podStartE2EDuration="3.531478441s" podCreationTimestamp="2026-04-22 15:16:28 +0000 UTC" firstStartedPulling="2026-04-22 15:16:29.336074383 +0000 UTC m=+3674.133836983" lastFinishedPulling="2026-04-22 15:16:30.375827916 +0000 UTC m=+3675.173590536" observedRunningTime="2026-04-22 15:16:31.529088642 +0000 UTC m=+3676.326851265" watchObservedRunningTime="2026-04-22 15:16:31.531478441 +0000 UTC m=+3676.329241062"
Apr 22 15:16:31.974069 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:31.974037 2573 ???:1] "http: TLS handshake error from 10.0.142.195:58170: EOF"
Apr 22 15:16:31.978052 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:31.978024 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2wkqh_d4bc0a54-af51-4f91-af28-fff86847e8d6/global-pull-secret-syncer/0.log"
Apr 22 15:16:32.164388 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:32.164349 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cvntg_0f603cf6-03b7-4b6b-aa45-5650e8076be3/konnectivity-agent/0.log"
Apr 22 15:16:32.265628 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:32.265528 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-45.ec2.internal_ff0e9bc99f6afd5adc8facc8707d2004/haproxy/0.log"
Apr 22 15:16:35.038751 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.038716 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4413c3ef-9f87-47de-bc69-50c496ba4b87/alertmanager/0.log"
Apr 22 15:16:35.068445 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.068409 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4413c3ef-9f87-47de-bc69-50c496ba4b87/config-reloader/0.log"
Apr 22 15:16:35.092831 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.092800 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4413c3ef-9f87-47de-bc69-50c496ba4b87/kube-rbac-proxy-web/0.log"
Apr 22 15:16:35.125105 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.125060 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4413c3ef-9f87-47de-bc69-50c496ba4b87/kube-rbac-proxy/0.log"
Apr 22 15:16:35.151830 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.151800 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4413c3ef-9f87-47de-bc69-50c496ba4b87/kube-rbac-proxy-metric/0.log"
Apr 22 15:16:35.179067 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.179035 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4413c3ef-9f87-47de-bc69-50c496ba4b87/prom-label-proxy/0.log"
Apr 22 15:16:35.212233 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.212195 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4413c3ef-9f87-47de-bc69-50c496ba4b87/init-config-reloader/0.log"
Apr 22 15:16:35.269170 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.268050 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-fv6qj_84e5fe39-c177-4874-a08b-cec368549879/cluster-monitoring-operator/0.log"
Apr 22 15:16:35.328031 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.327927 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-v7xr8_c820e8c6-75f5-4e79-9f8c-04e662cce3e8/kube-state-metrics/0.log"
Apr 22 15:16:35.355911 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.355886 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-v7xr8_c820e8c6-75f5-4e79-9f8c-04e662cce3e8/kube-rbac-proxy-main/0.log"
Apr 22 15:16:35.381488 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.381464 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-v7xr8_c820e8c6-75f5-4e79-9f8c-04e662cce3e8/kube-rbac-proxy-self/0.log"
Apr 22 15:16:35.417428 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.417404 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-66fbd5dbbd-nwpr6_df709a0c-327c-4e03-976f-3eae1b7859fa/metrics-server/0.log"
Apr 22 15:16:35.698991 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.698919 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kngm6_cfe6ba1b-0e9f-4a07-ae97-e903f41c1194/node-exporter/0.log"
Apr 22 15:16:35.720434 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.720413 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kngm6_cfe6ba1b-0e9f-4a07-ae97-e903f41c1194/kube-rbac-proxy/0.log"
Apr 22 15:16:35.745351 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:35.745309 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kngm6_cfe6ba1b-0e9f-4a07-ae97-e903f41c1194/init-textfile/0.log"
Apr 22 15:16:36.101354 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:36.101322 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vc7fj_37142c62-18c9-4bbe-b554-97c41a82c03e/prometheus-operator/0.log"
Apr 22 15:16:36.120023 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:36.119996 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vc7fj_37142c62-18c9-4bbe-b554-97c41a82c03e/kube-rbac-proxy/0.log"
Apr 22 15:16:36.150167 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:36.150135 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-2l4km_968a0cf8-f5d3-4918-9501-d37c536bbccf/prometheus-operator-admission-webhook/0.log"
Apr 22 15:16:36.279317 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:36.279286 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bd7945d5d-b2m8s_bd638ff7-6608-49c3-8b03-48587688258d/thanos-query/0.log"
Apr 22 15:16:36.305896 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:36.305866 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bd7945d5d-b2m8s_bd638ff7-6608-49c3-8b03-48587688258d/kube-rbac-proxy-web/0.log"
Apr 22 15:16:36.334470 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:36.334437 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bd7945d5d-b2m8s_bd638ff7-6608-49c3-8b03-48587688258d/kube-rbac-proxy/0.log"
Apr 22 15:16:36.357995 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:36.357920 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bd7945d5d-b2m8s_bd638ff7-6608-49c3-8b03-48587688258d/prom-label-proxy/0.log"
Apr 22 15:16:36.381656 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:36.381627 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bd7945d5d-b2m8s_bd638ff7-6608-49c3-8b03-48587688258d/kube-rbac-proxy-rules/0.log"
Apr 22 15:16:36.406975 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:36.406945 2573 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-monitoring_thanos-querier-bd7945d5d-b2m8s_bd638ff7-6608-49c3-8b03-48587688258d/kube-rbac-proxy-metrics/0.log" Apr 22 15:16:38.560045 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:38.560020 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d9b896989-qnxsz_79717c4c-c53e-450a-b5ff-ce14751d7d43/console/0.log" Apr 22 15:16:38.598640 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:38.598570 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-x676c_cf11cd5f-5f43-4b8f-9de3-5d91eabe9b5c/download-server/0.log" Apr 22 15:16:39.007989 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.007913 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-pp5sm_3b3c692c-1e00-4b52-b92e-ee544698dbd8/volume-data-source-validator/0.log" Apr 22 15:16:39.224896 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.224858 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9"] Apr 22 15:16:39.230282 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.230253 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.241150 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.241123 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9"] Apr 22 15:16:39.338020 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.337983 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-lib-modules\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.338189 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.338037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-podres\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.338189 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.338082 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbv7h\" (UniqueName: \"kubernetes.io/projected/6c308054-079b-4a57-815c-79d9b4fd3a23-kube-api-access-wbv7h\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.338189 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.338112 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-proc\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: 
\"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.338189 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.338179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-sys\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.438831 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.438788 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-sys\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.439029 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.438911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-lib-modules\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.439029 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.438955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-podres\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.439029 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.439001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbv7h\" (UniqueName: 
\"kubernetes.io/projected/6c308054-079b-4a57-815c-79d9b4fd3a23-kube-api-access-wbv7h\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.439200 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.439032 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-proc\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.439200 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.439130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-proc\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.439200 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.439130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-sys\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.439200 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.439142 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-podres\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.439200 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.439166 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c308054-079b-4a57-815c-79d9b4fd3a23-lib-modules\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.448025 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.447989 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbv7h\" (UniqueName: \"kubernetes.io/projected/6c308054-079b-4a57-815c-79d9b4fd3a23-kube-api-access-wbv7h\") pod \"perf-node-gather-daemonset-dfxg9\" (UID: \"6c308054-079b-4a57-815c-79d9b4fd3a23\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.543060 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.543026 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:39.687804 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.687768 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9"] Apr 22 15:16:39.693897 ip-10-0-136-45 kubenswrapper[2573]: W0422 15:16:39.693861 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6c308054_079b_4a57_815c_79d9b4fd3a23.slice/crio-f490568349ed21d2e0bbab8e9377d0dd90ed528fa8363688a7974569633c3594 WatchSource:0}: Error finding container f490568349ed21d2e0bbab8e9377d0dd90ed528fa8363688a7974569633c3594: Status 404 returned error can't find the container with id f490568349ed21d2e0bbab8e9377d0dd90ed528fa8363688a7974569633c3594 Apr 22 15:16:39.747415 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.747393 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-b7rqq_cab656ec-aafc-44d5-bcf4-998f7334f612/dns/0.log" Apr 22 15:16:39.769809 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.769785 
2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-b7rqq_cab656ec-aafc-44d5-bcf4-998f7334f612/kube-rbac-proxy/0.log" Apr 22 15:16:39.930376 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:39.930307 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b256j_04c378f7-681d-435b-8a89-47348177d100/dns-node-resolver/0.log" Apr 22 15:16:40.528987 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:40.528956 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vm6ts_b6c47887-6a82-4f97-8cf9-82dee1757b34/node-ca/0.log" Apr 22 15:16:40.552598 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:40.552550 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" event={"ID":"6c308054-079b-4a57-815c-79d9b4fd3a23","Type":"ContainerStarted","Data":"0d70f6d99e4e8c55c3df58085b2bc7654b61b9b1aeb2c3da30cdef59c3694877"} Apr 22 15:16:40.552855 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:40.552833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" event={"ID":"6c308054-079b-4a57-815c-79d9b4fd3a23","Type":"ContainerStarted","Data":"f490568349ed21d2e0bbab8e9377d0dd90ed528fa8363688a7974569633c3594"} Apr 22 15:16:40.553017 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:40.553002 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:40.569454 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:40.569399 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" podStartSLOduration=1.569380987 podStartE2EDuration="1.569380987s" podCreationTimestamp="2026-04-22 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:16:40.568495137 +0000 UTC m=+3685.366257760" watchObservedRunningTime="2026-04-22 15:16:40.569380987 +0000 UTC m=+3685.367143609" Apr 22 15:16:41.779081 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:41.779052 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kmxlt_6425daea-30dc-424d-a7ff-d63860240eee/serve-healthcheck-canary/0.log" Apr 22 15:16:42.165972 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:42.165902 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-2zk62_9de5847f-e323-4cd6-9aef-fde65fdaa5e2/insights-operator/0.log" Apr 22 15:16:42.165972 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:42.165931 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-2zk62_9de5847f-e323-4cd6-9aef-fde65fdaa5e2/insights-operator/1.log" Apr 22 15:16:42.371836 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:42.371807 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c9nv9_0d4a5b6f-e316-4ade-9baf-933024dc955e/kube-rbac-proxy/0.log" Apr 22 15:16:42.410513 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:42.410477 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c9nv9_0d4a5b6f-e316-4ade-9baf-933024dc955e/exporter/0.log" Apr 22 15:16:42.445632 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:42.445557 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c9nv9_0d4a5b6f-e316-4ade-9baf-933024dc955e/extractor/0.log" Apr 22 15:16:44.687947 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:44.687917 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-66cf78b85b-sdmgs_c4a98486-a759-4b82-a436-76232713ca74/manager/0.log" Apr 22 15:16:44.735589 
ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:44.735564 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-tdk8h_9fdaa5b6-06fa-460e-83ed-e8418eee7ec9/server/0.log" Apr 22 15:16:47.570668 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:47.570641 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-dfxg9" Apr 22 15:16:50.922787 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:50.922762 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbj7k_7da714b4-f642-4144-b84b-7f4d1f4cbe60/kube-multus-additional-cni-plugins/0.log" Apr 22 15:16:50.952302 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:50.952229 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbj7k_7da714b4-f642-4144-b84b-7f4d1f4cbe60/egress-router-binary-copy/0.log" Apr 22 15:16:50.978164 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:50.978134 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbj7k_7da714b4-f642-4144-b84b-7f4d1f4cbe60/cni-plugins/0.log" Apr 22 15:16:51.001291 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:51.001265 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbj7k_7da714b4-f642-4144-b84b-7f4d1f4cbe60/bond-cni-plugin/0.log" Apr 22 15:16:51.022408 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:51.022382 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbj7k_7da714b4-f642-4144-b84b-7f4d1f4cbe60/routeoverride-cni/0.log" Apr 22 15:16:51.044573 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:51.044538 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbj7k_7da714b4-f642-4144-b84b-7f4d1f4cbe60/whereabouts-cni-bincopy/0.log" Apr 22 15:16:51.066059 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:51.066029 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zbj7k_7da714b4-f642-4144-b84b-7f4d1f4cbe60/whereabouts-cni/0.log" Apr 22 15:16:51.100913 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:51.100864 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdw65_3885c220-9472-43c1-825a-2352438bbb35/kube-multus/0.log" Apr 22 15:16:51.207779 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:51.207698 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qh8tk_6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d/network-metrics-daemon/0.log" Apr 22 15:16:51.231617 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:51.231581 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qh8tk_6b80d7aa-1899-4bc8-94fb-e670ad1c3b3d/kube-rbac-proxy/0.log" Apr 22 15:16:52.436166 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:52.436133 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-controller/0.log" Apr 22 15:16:52.461090 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:52.461065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/0.log" Apr 22 15:16:52.478758 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:52.478731 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovn-acl-logging/1.log" Apr 22 15:16:52.505428 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:52.505396 2573 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/kube-rbac-proxy-node/0.log" Apr 22 15:16:52.528993 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:52.528958 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 15:16:52.586818 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:52.586786 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/northd/0.log" Apr 22 15:16:52.624454 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:52.624425 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/nbdb/0.log" Apr 22 15:16:52.646622 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:52.646537 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/sbdb/0.log" Apr 22 15:16:52.760766 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:52.760725 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t88vk_79402404-45d1-4ad9-b411-5640bfc88875/ovnkube-controller/0.log" Apr 22 15:16:54.093913 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:54.093882 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-8tn6g_dd765af3-73fe-4f78-9672-7301cb5a0352/check-endpoints/0.log" Apr 22 15:16:54.118254 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:54.118225 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mj694_083554f0-d10f-417b-ac2a-68e07b68b98b/network-check-target-container/0.log" Apr 22 15:16:55.230044 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:55.230002 
2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-l2j5r_992d1820-8cec-4643-a6fc-96f13a95fd10/iptables-alerter/0.log" Apr 22 15:16:55.864265 ip-10-0-136-45 kubenswrapper[2573]: I0422 15:16:55.864229 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5hdwm_26e11385-11b5-468d-8f37-f0a6251cf9f8/tuned/0.log"