Apr 17 07:48:35.128654 ip-10-0-130-28 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:48:35.128666 ip-10-0-130-28 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:48:35.128676 ip-10-0-130-28 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:48:35.128923 ip-10-0-130-28 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 07:48:45.225233 ip-10-0-130-28 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 07:48:45.225251 ip-10-0-130-28 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d00a38c56a2f4a6db71366c5a6984d44 --
Apr 17 07:51:21.074554 ip-10-0-130-28 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:51:21.557115 ip-10-0-130-28 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:21.557115 ip-10-0-130-28 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:51:21.557115 ip-10-0-130-28 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:21.557115 ip-10-0-130-28 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:51:21.557115 ip-10-0-130-28 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:21.557911 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.557821    2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:51:21.560975 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.560959    2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.560977    2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.560983    2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.560987    2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.560990    2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.560993    2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.560995    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.560999    2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561001    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561003    2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561006    2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561009    2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561011    2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561014    2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561016    2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561019    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:21.561015 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561022    2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561025    2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561028    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561030    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561033    2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561035    2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561038    2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561040    2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561042    2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561045    2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561053    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561056    2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561059    2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561061    2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561064    2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561066    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561069    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561072    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561074    2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561076    2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:21.561421 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561079    2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561081    2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561084    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561087    2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561089    2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561092    2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561094    2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561097    2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561099    2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561101    2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561104    2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561106    2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561108    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561111    2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561114    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561117    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561119    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561122    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561125    2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561127    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:21.562017 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561129    2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561132    2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561134    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561140    2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561144    2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561147    2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561149    2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561152    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561155    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561157    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561160    2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561162    2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561164    2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561167    2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561170    2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561173    2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561175    2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561177    2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561180    2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:21.562535 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561183    2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561185    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561187    2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561190    2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561192    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561195    2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561197    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561202    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561205    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561207    2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561210    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561590    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561595    2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561598    2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561601    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561603    2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561607    2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561609    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561612    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:21.562986 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561615    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561617    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561620    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561623    2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561625    2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561628    2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561631    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561633    2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561635    2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561638    2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561641    2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561643    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561646    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561648    2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561650    2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561653    2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561655    2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561658    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561660    2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561663    2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:21.563456 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561666    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561669    2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561672    2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561674    2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561676    2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561679    2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561682    2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561685    2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561688    2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561692    2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561695    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561698    2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561700    2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561703    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561705    2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561708    2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561710    2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561713    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561715    2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561718    2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:21.563980 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561720    2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561723    2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561725    2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561728    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561730    2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561733    2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561735    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561738    2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561741    2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561743    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561745    2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561748    2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561751    2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561753    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561756    2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561759    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561761    2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561764    2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561766    2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561769    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:21.564839 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561771    2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561773    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561776    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561778    2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561781    2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561783    2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561785    2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561788    2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561790    2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561793    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561797    2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561801    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561804    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561807    2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561810    2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561812    2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561815    2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.561818    2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563041    2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563050    2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:51:21.565422 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563057    2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563062    2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563066    2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563069    2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563074    2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563079    2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563082    2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563085    2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563089    2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563092    2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563095    2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563098    2567 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563101    2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563104    2567 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563106    2567 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563109    2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563113    2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563116    2567 flags.go:64] FLAG: --cluster-domain=""
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563119    2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563122    2567 flags.go:64] FLAG: --config-dir=""
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563125    2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563128    2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563132    2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563135    2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 07:51:21.565941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563138    2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563141    2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563145    2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563148    2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563153    2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563156    2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563159    2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563163    2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563167    2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563169    2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563173    2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563176    2567 flags.go:64] FLAG: --enable-server="true"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563178    2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563183    2567 flags.go:64] FLAG: --event-burst="100"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563186    2567 flags.go:64] FLAG: --event-qps="50"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563189    2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563192    2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563195    2567 flags.go:64] FLAG: --eviction-hard=""
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563199    2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563201 2567 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563204 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563207 2567 flags.go:64] FLAG: --eviction-soft="" Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563210 2567 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563213 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563216 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 07:51:21.566542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563220 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563223 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563226 2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563229 2567 flags.go:64] FLAG: --feature-gates="" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563233 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563236 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563239 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563242 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563244 
2567 flags.go:64] FLAG: --healthz-port="10248" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563247 2567 flags.go:64] FLAG: --help="false" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563254 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-130-28.ec2.internal" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563257 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563260 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563263 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563266 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563269 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563272 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563275 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563278 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563281 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563297 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563301 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 07:51:21.567147 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:51:21.563304 2567 flags.go:64] FLAG: --kube-reserved="" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563307 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 07:51:21.567147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563310 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563313 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563316 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563319 2567 flags.go:64] FLAG: --lock-file="" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563322 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563324 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563327 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563333 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563336 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563339 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563342 2567 flags.go:64] FLAG: --logging-format="text" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563345 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563348 2567 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563351 2567 flags.go:64] FLAG: --manifest-url="" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563354 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563358 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563361 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563365 2567 flags.go:64] FLAG: --max-pods="110" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563369 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563372 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563374 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563378 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563381 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563383 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563386 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 07:51:21.567778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563394 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563396 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 07:51:21.568394 
ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563400 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563404 2567 flags.go:64] FLAG: --pod-cidr="" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563407 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563412 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563415 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563419 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563422 2567 flags.go:64] FLAG: --port="10250" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563425 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563427 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-010f71b0a41a1fbbf" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563430 2567 flags.go:64] FLAG: --qos-reserved="" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563433 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563436 2567 flags.go:64] FLAG: --register-node="true" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563439 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563442 2567 flags.go:64] FLAG: --register-with-taints="" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 
07:51:21.563445 2567 flags.go:64] FLAG: --registry-burst="10" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563451 2567 flags.go:64] FLAG: --registry-qps="5" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563455 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563457 2567 flags.go:64] FLAG: --reserved-memory="" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563461 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563472 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563476 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563479 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563482 2567 flags.go:64] FLAG: --runonce="false" Apr 17 07:51:21.568394 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563485 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563488 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563491 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563494 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563497 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563499 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 07:51:21.568980 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:51:21.563502 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563505 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563508 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563511 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563514 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563517 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563520 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563523 2567 flags.go:64] FLAG: --system-cgroups="" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563525 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563531 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563534 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563536 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563540 2567 flags.go:64] FLAG: --tls-min-version="" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563543 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563546 2567 flags.go:64] FLAG: 
--topology-manager-policy="none" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563549 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563551 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563554 2567 flags.go:64] FLAG: --v="2" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563559 2567 flags.go:64] FLAG: --version="false" Apr 17 07:51:21.568980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563563 2567 flags.go:64] FLAG: --vmodule="" Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563567 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.563570 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563668 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563671 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563674 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563677 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563679 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563683 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563686 2567 
feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563688 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563692 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563694 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563697 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563700 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563702 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563704 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563707 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563710 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563713 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:51:21.569579 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563717 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563720 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563724 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563728 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563731 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563734 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563736 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563739 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563741 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563744 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563746 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563748 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563751 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 
07:51:21.563790 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563851 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563857 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563861 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563866 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563871 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:51:21.570067 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563876 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563881 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563885 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563890 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563894 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563899 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563903 2567 feature_gate.go:328] 
unrecognized feature gate: DNSNameResolver Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563913 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.563917 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564424 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564449 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564456 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564463 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564468 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564474 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564479 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564485 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564489 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564494 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564499 2567 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:51:21.570567 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564504 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564516 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564520 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564524 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564528 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564533 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564537 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564541 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564545 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564550 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564554 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564558 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 
07:51:21.564565 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564569 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564578 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564582 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564587 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564591 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564595 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564599 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:51:21.571087 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564603 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564608 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564613 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564618 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564623 2567 feature_gate.go:328] unrecognized feature gate: NewOLM 
Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564627 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564631 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564639 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564644 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.564648 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:51:21.571660 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.565615 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 07:51:21.573635 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.573518 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 07:51:21.573670 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.573639 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 07:51:21.573697 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573688 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:51:21.573697 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573694 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:51:21.573697 
ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573698 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573703 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573706 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573709 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573712 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573715 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573717 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573720 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573723 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573725 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573728 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573731 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573734 2567 feature_gate.go:328] 
unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573737 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573739 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573742 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573744 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573747 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573749 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573752 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:51:21.573781 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573754 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573757 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573759 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573762 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573765 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 
07:51:21.573767 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573769 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573772 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573774 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573777 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573779 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573781 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573784 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573788 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573791 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573794 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573797 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573799 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573802 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:51:21.574269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573804 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573807 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573809 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573812 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573814 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573816 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573819 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573822 2567 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573824 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573827 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573829 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573831 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573833 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573836 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573838 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573842 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573844 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573847 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573850 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:51:21.574756 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573854 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573857 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573860 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573863 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573866 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573869 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573871 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573874 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573876 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573879 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573881 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573884 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573886 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573889 2567 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573891 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573894 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573896 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573899 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573901 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573904 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:51:21.575210 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573906 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573909 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573912 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573914 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573916 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.573919 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: I0417 
07:51:21.573924 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574016 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574020 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574024 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574027 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574030 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574033 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574035 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574038 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574041 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:51:21.575730 ip-10-0-130-28 kubenswrapper[2567]: 
W0417 07:51:21.574043 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574046 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574048 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574051 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574054 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574057 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574059 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574062 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574064 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574067 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574069 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574072 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574074 2567 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574076 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574079 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574081 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574084 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574086 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574088 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:51:21.576117 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574091 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574093 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574096 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574098 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574101 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574104 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: 
W0417 07:51:21.574107 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574110 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574112 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574115 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574117 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574120 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574122 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574124 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574127 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574129 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574132 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574134 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574137 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 
07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574139 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:51:21.576602 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574141 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574144 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574146 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574149 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574151 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574153 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574156 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574158 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574160 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574163 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574165 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574168 2567 feature_gate.go:328] unrecognized feature 
gate: NewOLM Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574170 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574173 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574175 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574177 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574180 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574182 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574187 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574191 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:51:21.577085 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574193 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574195 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574198 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574200 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574203 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574205 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574207 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574210 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574212 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574215 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574217 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574221 2567 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574224 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574226 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574229 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574232 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574235 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:51:21.577637 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:21.574237 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:51:21.578054 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.574241 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 07:51:21.578054 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.575009 2567 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 07:51:21.578054 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.577060 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 
07:51:21.578054 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.577973 2567 server.go:1019] "Starting client certificate rotation" Apr 17 07:51:21.578264 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.578076 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 07:51:21.578264 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.578130 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 07:51:21.604165 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.604138 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 07:51:21.606690 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.606669 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 07:51:21.625279 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.625258 2567 log.go:25] "Validated CRI v1 runtime API" Apr 17 07:51:21.632038 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.632019 2567 log.go:25] "Validated CRI v1 image API" Apr 17 07:51:21.633309 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.633270 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 07:51:21.634464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.634449 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 07:51:21.637303 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.637227 2567 fs.go:135] Filesystem UUIDs: map[256cff7e-5ae8-4b22-99ef-23c7ad7139f4:/dev/nvme0n1p4 6fd1f69d-ada0-4679-bf42-6af2ceaa08d0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 17 07:51:21.637408 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.637303 2567 fs.go:136] 
Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 07:51:21.643834 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.643720 2567 manager.go:217] Machine: {Timestamp:2026-04-17 07:51:21.642072557 +0000 UTC m=+0.445023290 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099598 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24a180a7b6f0b286f4d82bc24b1063 SystemUUID:ec24a180-a7b6-f0b2-86f4-d82bc24b1063 BootID:d00a38c5-6a2f-4a6d-b713-66c5a6984d44 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c5:ac:43:1f:71 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c5:ac:43:1f:71 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:c7:95:13:8a:43 
Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 07:51:21.643834 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.643824 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 07:51:21.643966 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.643912 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 07:51:21.647493 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.647462 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 07:51:21.647636 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.647495 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-28.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 07:51:21.647683 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.647646 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 07:51:21.647683 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.647655 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 07:51:21.647683 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.647668 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 07:51:21.648316 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.648306 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 07:51:21.649650 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.649640 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 07:51:21.649768 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.649759 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 07:51:21.651731 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.651716 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-llmd6"
Apr 17 07:51:21.652149 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.652139 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 07:51:21.652178 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.652155 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 07:51:21.652178 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.652167 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 07:51:21.652178 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.652177 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 17 07:51:21.652306 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.652185 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 07:51:21.653833 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.653821 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 07:51:21.653894 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.653839 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 07:51:21.657213 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.657184 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 07:51:21.658558 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.658545 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 07:51:21.659350 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.659332 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-llmd6"
Apr 17 07:51:21.660567 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660555 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 07:51:21.660610 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660572 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 07:51:21.660610 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660578 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 07:51:21.660610 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660583 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 07:51:21.660610 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660589 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 07:51:21.660610 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660594 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 07:51:21.660610 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660600 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 07:51:21.660764 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660620 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 07:51:21.660764 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660628 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 07:51:21.660764 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660634 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 07:51:21.660764 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660651 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 07:51:21.660764 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.660659 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 07:51:21.661389 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.661367 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 07:51:21.661389 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.661388 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 07:51:21.665378 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.665362 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 07:51:21.665443 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.665424 2567 server.go:1295] "Started kubelet"
Apr 17 07:51:21.665617 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.665554 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 07:51:21.665718 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.665650 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 07:51:21.665756 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.665716 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 07:51:21.666485 ip-10-0-130-28 systemd[1]: Started Kubernetes Kubelet.
Apr 17 07:51:21.667445 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.667429 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 07:51:21.668143 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.668127 2567 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 07:51:21.668599 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.668584 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:21.670558 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.670543 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:21.672308 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.672273 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-28.ec2.internal" not found
Apr 17 07:51:21.675542 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:21.675517 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 07:51:21.676113 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.676084 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 07:51:21.676194 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.676136 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 07:51:21.678208 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.678185 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 07:51:21.678208 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.678191 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 07:51:21.678358 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.678205 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 07:51:21.678358 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:21.678244 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-28.ec2.internal\" not found"
Apr 17 07:51:21.678457 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.678415 2567 factory.go:55] Registering systemd factory
Apr 17 07:51:21.678457 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.678420 2567 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 07:51:21.678457 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.678429 2567 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 07:51:21.678457 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.678435 2567 factory.go:223] Registration of the systemd container factory successfully
Apr 17 07:51:21.678636 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.678508 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:21.679127 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.679109 2567 factory.go:153] Registering CRI-O factory
Apr 17 07:51:21.679213 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.679131 2567 factory.go:223] Registration of the crio container factory successfully
Apr 17 07:51:21.679213 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.679185 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 07:51:21.679213 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.679207 2567 factory.go:103] Registering Raw factory
Apr 17 07:51:21.679384 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.679223 2567 manager.go:1196] Started watching for new ooms in manager
Apr 17 07:51:21.679718 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.679706 2567 manager.go:319] Starting recovery of all containers
Apr 17 07:51:21.681372 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:21.681344 2567 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-28.ec2.internal\" not found" node="ip-10-0-130-28.ec2.internal"
Apr 17 07:51:21.686124 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.686082 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 07:51:21.689201 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.689067 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-28.ec2.internal" not found
Apr 17 07:51:21.689993 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.689977 2567 manager.go:324] Recovery completed
Apr 17 07:51:21.694478 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.694463 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 07:51:21.696132 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.696119 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-28.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 07:51:21.696194 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.696148 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-28.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 07:51:21.696194 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.696158 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-28.ec2.internal" event="NodeHasSufficientPID"
Apr 17 07:51:21.696660 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.696648 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 07:51:21.696660 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.696660 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 07:51:21.696763 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.696680 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 07:51:21.698773 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.698762 2567 policy_none.go:49] "None policy: Start"
Apr 17 07:51:21.698812 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.698778 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 07:51:21.698812 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.698787 2567 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 07:51:21.733551 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.733536 2567 manager.go:341] "Starting Device Plugin manager"
Apr 17 07:51:21.733634 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:21.733574 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 07:51:21.733634 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.733586 2567 server.go:85] "Starting device plugin registration server"
Apr 17 07:51:21.733820 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.733809 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 07:51:21.733876 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.733821 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 07:51:21.733985 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.733953 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 07:51:21.734040 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.734019 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 07:51:21.734040 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.734028 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 07:51:21.734972 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:21.734948 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 07:51:21.735064 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:21.734989 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-28.ec2.internal\" not found"
Apr 17 07:51:21.749377 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.749360 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-28.ec2.internal" not found
Apr 17 07:51:21.777943 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.777922 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 07:51:21.778035 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.777953 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 07:51:21.778035 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.777989 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 07:51:21.778035 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.777998 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 07:51:21.778146 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:21.778082 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 07:51:21.780069 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.780051 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:21.834155 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.834106 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 07:51:21.835244 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.835229 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-28.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 07:51:21.835340 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.835258 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-28.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 07:51:21.835340 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.835268 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-28.ec2.internal" event="NodeHasSufficientPID"
Apr 17 07:51:21.835340 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.835311 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-28.ec2.internal"
Apr 17 07:51:21.849133 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.849111 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-28.ec2.internal"
Apr 17 07:51:21.849218 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:21.849133 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-28.ec2.internal\": node \"ip-10-0-130-28.ec2.internal\" not found"
Apr 17 07:51:21.878790 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.878764 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal"]
Apr 17 07:51:21.881137 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.881121 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:21.881744 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.881490 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:21.916973 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.916955 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:21.921403 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.921388 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:21.928389 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.928374 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:51:21.930666 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:21.930649 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:51:22.079344 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.079281 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1260d269b19f56cf5f4a135125663f49-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal\" (UID: \"1260d269b19f56cf5f4a135125663f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.079533 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.079357 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1260d269b19f56cf5f4a135125663f49-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal\" (UID: \"1260d269b19f56cf5f4a135125663f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.079533 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.079429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/90295ff52276f4357ebdbd2fe3bf0ff4-config\") pod \"kube-apiserver-proxy-ip-10-0-130-28.ec2.internal\" (UID: \"90295ff52276f4357ebdbd2fe3bf0ff4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.180125 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.180037 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1260d269b19f56cf5f4a135125663f49-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal\" (UID: \"1260d269b19f56cf5f4a135125663f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.180125 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.180081 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/90295ff52276f4357ebdbd2fe3bf0ff4-config\") pod \"kube-apiserver-proxy-ip-10-0-130-28.ec2.internal\" (UID: \"90295ff52276f4357ebdbd2fe3bf0ff4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.180125 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.180102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1260d269b19f56cf5f4a135125663f49-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal\" (UID: \"1260d269b19f56cf5f4a135125663f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.180361 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.180154 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1260d269b19f56cf5f4a135125663f49-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal\" (UID: \"1260d269b19f56cf5f4a135125663f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.180361 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.180157 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/90295ff52276f4357ebdbd2fe3bf0ff4-config\") pod \"kube-apiserver-proxy-ip-10-0-130-28.ec2.internal\" (UID: \"90295ff52276f4357ebdbd2fe3bf0ff4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.180361 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.180155 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1260d269b19f56cf5f4a135125663f49-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal\" (UID: \"1260d269b19f56cf5f4a135125663f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.231226 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.231184 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.233656 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.233641 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"
Apr 17 07:51:22.577959 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.577867 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 07:51:22.578652 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.578022 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:51:22.578652 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.578029 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:51:22.578652 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.578053 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:51:22.653350 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.653319 2567 apiserver.go:52] "Watching apiserver"
Apr 17 07:51:22.660755 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.660722 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 07:51:22.661586 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.661559 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:46:21 +0000 UTC" deadline="2028-01-19 21:03:06.589202562 +0000 UTC"
Apr 17 07:51:22.661645 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.661591 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15421h11m43.927618641s"
Apr 17 07:51:22.662529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.662505 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-nnkhx","openshift-network-diagnostics/network-check-target-gz9xl","openshift-network-operator/iptables-alerter-zn97z","openshift-ovn-kubernetes/ovnkube-node-4l4mp","kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8","openshift-dns/node-resolver-4dz7x","openshift-multus/multus-additional-cni-plugins-qgq6m","openshift-multus/multus-grgq6","kube-system/konnectivity-agent-vpzvb","openshift-cluster-node-tuning-operator/tuned-cqr65","openshift-image-registry/node-ca-ks9m9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal"]
Apr 17 07:51:22.665998 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.665978 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:22.666108 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:22.666065 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d"
Apr 17 07:51:22.667964 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.667941 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:51:22.668051 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:22.668025 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872"
Apr 17 07:51:22.668096 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.668056 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zn97z"
Apr 17 07:51:22.670374 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.670343 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 07:51:22.670374 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.670357 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 07:51:22.670550 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.670451 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:51:22.670550 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.670468 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xrt5x\""
Apr 17 07:51:22.672553 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.672537 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.674371 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.674355 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:22.674452 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.674385 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4dz7x"
Apr 17 07:51:22.674737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.674721 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 07:51:22.674961 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.674947 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 07:51:22.675825 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.675805 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 07:51:22.675890 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.675854 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 07:51:22.675890 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.675881 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 07:51:22.676004 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.675894 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 07:51:22.676004 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.675931 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gp54w\""
Apr 17 07:51:22.676387 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.676367 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 07:51:22.676508 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.676488 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.676508 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.676500 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 07:51:22.676617 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.676508 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-n2pqw\""
Apr 17 07:51:22.676617 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.676514 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-v8vrc\""
Apr 17 07:51:22.676709 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.676664 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 07:51:22.677569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.677546 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 07:51:22.677667 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.677638 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 07:51:22.677732 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.677676 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 07:51:22.678484 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.678471 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 07:51:22.679454 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.679429 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.680752 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.680731 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 07:51:22.680851 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.680833 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gfkzv\""
Apr 17 07:51:22.680907 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.680880 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 07:51:22.681418 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.681396 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 07:51:22.681498 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.681465 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 07:51:22.681938 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.681919 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fv5cs\""
Apr 17 07:51:22.682235 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682204 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:22.682359 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682271 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-socket-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:22.682359 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682353 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-run-k8s-cni-cncf-io\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.682448 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682399 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-hostroot\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.682482 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682442 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-kubelet\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.682482 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682478 2567
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-system-cni-dir\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.682581 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682498 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-var-lib-kubelet\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.682821 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682523 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9qnz\" (UniqueName: \"kubernetes.io/projected/88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6-kube-api-access-x9qnz\") pod \"iptables-alerter-zn97z\" (UID: \"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6\") " pod="openshift-network-operator/iptables-alerter-zn97z" Apr 17 07:51:22.682883 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682848 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-os-release\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.682926 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682915 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 07:51:22.682975 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682900 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-run-ovn\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.682975 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.682954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6-iptables-alerter-script\") pod \"iptables-alerter-zn97z\" (UID: \"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6\") " pod="openshift-network-operator/iptables-alerter-zn97z" Apr 17 07:51:22.683054 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683000 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-cnibin\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683054 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683045 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-run-netns\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683129 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683068 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-ovnkube-script-lib\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683129 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683096 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhrr\" (UniqueName: \"kubernetes.io/projected/bb277a7c-b922-4c78-a4fd-5882a862b97a-kube-api-access-vjhrr\") pod \"node-resolver-4dz7x\" (UID: \"bb277a7c-b922-4c78-a4fd-5882a862b97a\") " pod="openshift-dns/node-resolver-4dz7x" Apr 17 07:51:22.683129 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683118 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.683237 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683139 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-slash\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683237 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683198 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-cni-dir\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683237 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683215 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-socket-dir-parent\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " 
pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683373 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683254 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-cni-binary-copy\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.683373 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683268 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.683373 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683301 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.683373 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683326 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-env-overrides\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683373 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683349 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-ovn-node-metrics-cert\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683373 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683362 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-cni-bin\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbd6\" (UniqueName: \"kubernetes.io/projected/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-kube-api-access-spbd6\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683393 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x86r\" (UniqueName: \"kubernetes.io/projected/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-kube-api-access-5x86r\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:22.683569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683433 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-etc-selinux\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.683569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683445 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vpzvb" Apr 17 07:51:22.683569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683479 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj7n4\" (UniqueName: \"kubernetes.io/projected/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-kube-api-access-cj7n4\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.683569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683524 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-daemon-config\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-run-netns\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683568 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb277a7c-b922-4c78-a4fd-5882a862b97a-tmp-dir\") pod \"node-resolver-4dz7x\" (UID: \"bb277a7c-b922-4c78-a4fd-5882a862b97a\") " pod="openshift-dns/node-resolver-4dz7x" Apr 17 07:51:22.683930 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:51:22.683628 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-var-lib-cni-bin\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683930 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683665 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-conf-dir\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683930 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683691 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-log-socket\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683930 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683714 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-os-release\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683930 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efad7b51-5842-4aff-abb9-6379ecca5cc4-cni-binary-copy\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 
07:51:22.683930 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683761 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-node-log\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683930 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683784 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-run-ovn-kubernetes\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.683930 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683817 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-etc-kubernetes\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683930 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683855 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xk99\" (UniqueName: \"kubernetes.io/projected/efad7b51-5842-4aff-abb9-6379ecca5cc4-kube-api-access-4xk99\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.683930 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-cni-netd\") pod \"ovnkube-node-4l4mp\" (UID: 
\"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-device-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.683965 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-system-cni-dir\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684021 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-var-lib-cni-multus\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684044 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-run-multus-certs\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684070 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684094 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-registration-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684120 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-run-systemd\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684148 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-var-lib-openvswitch\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " 
pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:22.684202 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684200 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-systemd-units\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.684529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684227 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-etc-openvswitch\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.684529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684247 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-ovnkube-config\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.684529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684273 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bb277a7c-b922-4c78-a4fd-5882a862b97a-hosts-file\") pod \"node-resolver-4dz7x\" (UID: \"bb277a7c-b922-4c78-a4fd-5882a862b97a\") " pod="openshift-dns/node-resolver-4dz7x" Apr 17 07:51:22.684529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684319 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-sys-fs\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.684529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684342 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbg5l\" (UniqueName: \"kubernetes.io/projected/760ffb8e-f999-408c-901c-44fed91982db-kube-api-access-bbg5l\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.684529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684360 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-cnibin\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.684529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-run-openvswitch\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.684529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:22.684529 
ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.684410 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6-host-slash\") pod \"iptables-alerter-zn97z\" (UID: \"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6\") " pod="openshift-network-operator/iptables-alerter-zn97z" Apr 17 07:51:22.685585 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.685568 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.685696 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.685646 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 07:51:22.685696 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.685683 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 07:51:22.686080 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.686064 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hrtps\"" Apr 17 07:51:22.686863 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.686842 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 07:51:22.687830 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.687811 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jwllh\"" Apr 17 07:51:22.687830 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.687823 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:51:22.687970 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:51:22.687814 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ks9m9"
Apr 17 07:51:22.687970 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.687817 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 07:51:22.690536 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.690518 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 07:51:22.690640 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.690574 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 07:51:22.690829 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.690792 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kg88g\""
Apr 17 07:51:22.690875 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.690831 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 07:51:22.708005 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.707987 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ml2z7"
Apr 17 07:51:22.715390 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.715371 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ml2z7"
Apr 17 07:51:22.734060 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:22.733801 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90295ff52276f4357ebdbd2fe3bf0ff4.slice/crio-b26e54caa29afee50d61dab50ebd7f4a16003ec7257bd0f94c22a58943da7eec WatchSource:0}: Error finding container b26e54caa29afee50d61dab50ebd7f4a16003ec7257bd0f94c22a58943da7eec: Status 404 returned error can't find the container with id b26e54caa29afee50d61dab50ebd7f4a16003ec7257bd0f94c22a58943da7eec
Apr 17 07:51:22.734384 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:22.734358 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1260d269b19f56cf5f4a135125663f49.slice/crio-6855dd2ee1b9cb63ba99edfb2759e07e87b2525281e4fc98c22b8b37bf028e23 WatchSource:0}: Error finding container 6855dd2ee1b9cb63ba99edfb2759e07e87b2525281e4fc98c22b8b37bf028e23: Status 404 returned error can't find the container with id 6855dd2ee1b9cb63ba99edfb2759e07e87b2525281e4fc98c22b8b37bf028e23
Apr 17 07:51:22.738878 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.738862 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 07:51:22.778849 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.778828 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 07:51:22.780673 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.780626 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal" event={"ID":"1260d269b19f56cf5f4a135125663f49","Type":"ContainerStarted","Data":"6855dd2ee1b9cb63ba99edfb2759e07e87b2525281e4fc98c22b8b37bf028e23"}
Apr 17 07:51:22.781530 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.781509 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal" event={"ID":"90295ff52276f4357ebdbd2fe3bf0ff4","Type":"ContainerStarted","Data":"b26e54caa29afee50d61dab50ebd7f4a16003ec7257bd0f94c22a58943da7eec"}
Apr 17 07:51:22.784773 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784755 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c6bafdd-ce56-49a3-8070-bd48e97302b2-tmp\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.784834 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784783 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/187a524e-8b69-4904-8985-6b33cf3dc3d1-agent-certs\") pod \"konnectivity-agent-vpzvb\" (UID: \"187a524e-8b69-4904-8985-6b33cf3dc3d1\") " pod="kube-system/konnectivity-agent-vpzvb"
Apr 17 07:51:22.784834 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784805 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9qnz\" (UniqueName: \"kubernetes.io/projected/88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6-kube-api-access-x9qnz\") pod \"iptables-alerter-zn97z\" (UID: \"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6\") " pod="openshift-network-operator/iptables-alerter-zn97z"
Apr 17 07:51:22.784928 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784844 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-os-release\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.784928 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784868 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-run-ovn\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.784928 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784886 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/187a524e-8b69-4904-8985-6b33cf3dc3d1-konnectivity-ca\") pod \"konnectivity-agent-vpzvb\" (UID: \"187a524e-8b69-4904-8985-6b33cf3dc3d1\") " pod="kube-system/konnectivity-agent-vpzvb"
Apr 17 07:51:22.784928 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784910 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6-iptables-alerter-script\") pod \"iptables-alerter-zn97z\" (UID: \"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6\") " pod="openshift-network-operator/iptables-alerter-zn97z"
Apr 17 07:51:22.785107 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784936 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-cnibin\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.785107 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784950 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-os-release\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.785107 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784961 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-run-netns\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.785107 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784937 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-run-ovn\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.785107 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785005 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-cnibin\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.785107 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.784985 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-ovnkube-script-lib\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.785107 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785044 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-run-netns\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.785107 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhrr\" (UniqueName: \"kubernetes.io/projected/bb277a7c-b922-4c78-a4fd-5882a862b97a-kube-api-access-vjhrr\") pod \"node-resolver-4dz7x\" (UID: \"bb277a7c-b922-4c78-a4fd-5882a862b97a\") " pod="openshift-dns/node-resolver-4dz7x"
Apr 17 07:51:22.785453 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785139 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.785453 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785179 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-slash\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.785453 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785213 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-modprobe-d\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.785453 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785245 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-lib-modules\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.785453 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785271 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-slash\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.785453 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785273 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-cni-dir\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.785453 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785313 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.785453 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785354 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-socket-dir-parent\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.785453 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-cni-dir\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-sys\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-socket-dir-parent\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785493 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-cni-binary-copy\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785524 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785528 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6-iptables-alerter-script\") pod \"iptables-alerter-zn97z\" (UID: \"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6\") " pod="openshift-network-operator/iptables-alerter-zn97z"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785556 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785567 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-ovnkube-script-lib\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785584 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-env-overrides\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785611 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-ovn-node-metrics-cert\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.785737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785641 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-sysctl-conf\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785785 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-cni-bin\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spbd6\" (UniqueName: \"kubernetes.io/projected/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-kube-api-access-spbd6\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785850 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-sysctl-d\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785869 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-cni-bin\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785874 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-var-lib-kubelet\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785902 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5x86r\" (UniqueName: \"kubernetes.io/projected/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-kube-api-access-5x86r\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785953 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-etc-selinux\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785975 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786031 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-env-overrides\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786119 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.785979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cj7n4\" (UniqueName: \"kubernetes.io/projected/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-kube-api-access-cj7n4\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786119 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-cni-binary-copy\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.786156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786166 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-etc-selinux\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-daemon-config\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786199 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-run-netns\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786223 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7254920c-50ea-4fc4-b393-00fa4b69ad5b-serviceca\") pod \"node-ca-ks9m9\" (UID: \"7254920c-50ea-4fc4-b393-00fa4b69ad5b\") " pod="openshift-image-registry/node-ca-ks9m9"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786261 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-run-netns\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786311 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb277a7c-b922-4c78-a4fd-5882a862b97a-tmp-dir\") pod \"node-resolver-4dz7x\" (UID: \"bb277a7c-b922-4c78-a4fd-5882a862b97a\") " pod="openshift-dns/node-resolver-4dz7x"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786341 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-var-lib-cni-bin\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786365 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-conf-dir\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786388 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-log-socket\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786395 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-var-lib-cni-bin\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786415 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvcgl\" (UniqueName: \"kubernetes.io/projected/6c6bafdd-ce56-49a3-8070-bd48e97302b2-kube-api-access-hvcgl\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-log-socket\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786414 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-conf-dir\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786451 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7254920c-50ea-4fc4-b393-00fa4b69ad5b-host\") pod \"node-ca-ks9m9\" (UID: \"7254920c-50ea-4fc4-b393-00fa4b69ad5b\") " pod="openshift-image-registry/node-ca-ks9m9"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786483 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-os-release\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efad7b51-5842-4aff-abb9-6379ecca5cc4-cni-binary-copy\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-os-release\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.786728 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-node-log\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-node-log\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-run-ovn-kubernetes\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786605 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb277a7c-b922-4c78-a4fd-5882a862b97a-tmp-dir\") pod \"node-resolver-4dz7x\" (UID: \"bb277a7c-b922-4c78-a4fd-5882a862b97a\") " pod="openshift-dns/node-resolver-4dz7x"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786624 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-etc-kubernetes\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786614 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-run-ovn-kubernetes\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xk99\" (UniqueName: \"kubernetes.io/projected/efad7b51-5842-4aff-abb9-6379ecca5cc4-kube-api-access-4xk99\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786653 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/efad7b51-5842-4aff-abb9-6379ecca5cc4-multus-daemon-config\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-cni-netd\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786674 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-etc-kubernetes\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786718 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-cni-netd\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786695 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-tuned\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786881 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-device-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786906 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-system-cni-dir\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-var-lib-cni-multus\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efad7b51-5842-4aff-abb9-6379ecca5cc4-cni-binary-copy\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786950 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-run-multus-certs\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786979 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-device-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:22.787356 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786991 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-run-multus-certs\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-system-cni-dir\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786982 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787025 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.786982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-var-lib-cni-multus\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-sysconfig\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-systemd\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787097 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-host\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787137 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-registration-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787171 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-registration-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787188 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-run-systemd\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787211 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-run-systemd\") pod \"ovnkube-node-4l4mp\"
(UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787224 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-var-lib-openvswitch\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787259 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787303 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-systemd-units\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-var-lib-openvswitch\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787330 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-etc-openvswitch\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787346 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-systemd-units\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787357 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-ovnkube-config\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787383 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-etc-openvswitch\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787384 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-run\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787424 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/bb277a7c-b922-4c78-a4fd-5882a862b97a-hosts-file\") pod \"node-resolver-4dz7x\" (UID: \"bb277a7c-b922-4c78-a4fd-5882a862b97a\") " pod="openshift-dns/node-resolver-4dz7x" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787452 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-sys-fs\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bb277a7c-b922-4c78-a4fd-5882a862b97a-hosts-file\") pod \"node-resolver-4dz7x\" (UID: \"bb277a7c-b922-4c78-a4fd-5882a862b97a\") " pod="openshift-dns/node-resolver-4dz7x" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787508 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-sys-fs\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787529 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbg5l\" (UniqueName: \"kubernetes.io/projected/760ffb8e-f999-408c-901c-44fed91982db-kube-api-access-bbg5l\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787558 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-cnibin\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-run-openvswitch\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787603 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhcq\" (UniqueName: \"kubernetes.io/projected/7254920c-50ea-4fc4-b393-00fa4b69ad5b-kube-api-access-2dhcq\") pod \"node-ca-ks9m9\" (UID: \"7254920c-50ea-4fc4-b393-00fa4b69ad5b\") " pod="openshift-image-registry/node-ca-ks9m9" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787622 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-cnibin\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787637 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6-host-slash\") pod \"iptables-alerter-zn97z\" (UID: \"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6\") " pod="openshift-network-operator/iptables-alerter-zn97z" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787654 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787673 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-run-openvswitch\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.788557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-socket-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787714 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-run-k8s-cni-cncf-io\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.789053 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:51:22.787722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:22.787732 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6-host-slash\") pod \"iptables-alerter-zn97z\" (UID: \"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6\") " pod="openshift-network-operator/iptables-alerter-zn97z" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787741 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-hostroot\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-run-k8s-cni-cncf-io\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-kubelet\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787790 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-hostroot\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787830 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-kubernetes\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-system-cni-dir\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787877 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-host-kubelet\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787884 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/760ffb8e-f999-408c-901c-44fed91982db-socket-dir\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787894 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-var-lib-kubelet\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787918 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-system-cni-dir\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:22.787931 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs podName:86e593a1-ee06-4a3a-9bef-3d1c3097b01d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:23.287916007 +0000 UTC m=+2.090866726 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs") pod "network-metrics-daemon-nnkhx" (UID: "86e593a1-ee06-4a3a-9bef-3d1c3097b01d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.787934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efad7b51-5842-4aff-abb9-6379ecca5cc4-host-var-lib-kubelet\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.789053 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.788245 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-ovnkube-config\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.789529 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.789139 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-ovn-node-metrics-cert\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.792616 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.792599 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9qnz\" (UniqueName: \"kubernetes.io/projected/88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6-kube-api-access-x9qnz\") pod \"iptables-alerter-zn97z\" (UID: \"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6\") " pod="openshift-network-operator/iptables-alerter-zn97z" Apr 17 07:51:22.796328 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.796309 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhrr\" (UniqueName: \"kubernetes.io/projected/bb277a7c-b922-4c78-a4fd-5882a862b97a-kube-api-access-vjhrr\") pod \"node-resolver-4dz7x\" (UID: \"bb277a7c-b922-4c78-a4fd-5882a862b97a\") " pod="openshift-dns/node-resolver-4dz7x" Apr 17 07:51:22.796467 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:22.796428 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:22.796467 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:22.796447 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:22.796467 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:22.796460 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mnmfq for pod openshift-network-diagnostics/network-check-target-gz9xl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:22.796637 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:22.796583 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq podName:c2f87d28-3811-45b6-bdd2-ca07124aa872 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:23.296496718 +0000 UTC m=+2.099447441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mnmfq" (UniqueName: "kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq") pod "network-check-target-gz9xl" (UID: "c2f87d28-3811-45b6-bdd2-ca07124aa872") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:22.798004 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.797977 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbd6\" (UniqueName: \"kubernetes.io/projected/a5237768-3d38-4f21-8b97-c1ffd5d7cec2-kube-api-access-spbd6\") pod \"ovnkube-node-4l4mp\" (UID: \"a5237768-3d38-4f21-8b97-c1ffd5d7cec2\") " pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:22.798090 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.798001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x86r\" (UniqueName: \"kubernetes.io/projected/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-kube-api-access-5x86r\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:22.798369 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.798354 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xk99\" (UniqueName: \"kubernetes.io/projected/efad7b51-5842-4aff-abb9-6379ecca5cc4-kube-api-access-4xk99\") pod \"multus-grgq6\" (UID: \"efad7b51-5842-4aff-abb9-6379ecca5cc4\") " pod="openshift-multus/multus-grgq6" Apr 17 07:51:22.798669 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.798652 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj7n4\" (UniqueName: \"kubernetes.io/projected/b560c5d8-3216-49b3-be3c-2ad93d8b4e7a-kube-api-access-cj7n4\") pod \"multus-additional-cni-plugins-qgq6m\" (UID: \"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a\") " 
pod="openshift-multus/multus-additional-cni-plugins-qgq6m" Apr 17 07:51:22.798877 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.798859 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbg5l\" (UniqueName: \"kubernetes.io/projected/760ffb8e-f999-408c-901c-44fed91982db-kube-api-access-bbg5l\") pod \"aws-ebs-csi-driver-node-cr5t8\" (UID: \"760ffb8e-f999-408c-901c-44fed91982db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" Apr 17 07:51:22.888200 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-modprobe-d\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.888200 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888157 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-lib-modules\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.888200 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-sys\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888216 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-sysctl-conf\") pod \"tuned-cqr65\" (UID: 
\"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888233 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-sysctl-d\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888255 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-var-lib-kubelet\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888275 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7254920c-50ea-4fc4-b393-00fa4b69ad5b-serviceca\") pod \"node-ca-ks9m9\" (UID: \"7254920c-50ea-4fc4-b393-00fa4b69ad5b\") " pod="openshift-image-registry/node-ca-ks9m9" Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888301 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-lib-modules\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65" Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888313 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvcgl\" (UniqueName: \"kubernetes.io/projected/6c6bafdd-ce56-49a3-8070-bd48e97302b2-kube-api-access-hvcgl\") pod \"tuned-cqr65\" (UID: 
\"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888304 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-modprobe-d\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888337 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7254920c-50ea-4fc4-b393-00fa4b69ad5b-host\") pod \"node-ca-ks9m9\" (UID: \"7254920c-50ea-4fc4-b393-00fa4b69ad5b\") " pod="openshift-image-registry/node-ca-ks9m9"
Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888366 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-tuned\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888382 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-sys\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888411 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-sysctl-d\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888418 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-sysctl-conf\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888458 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7254920c-50ea-4fc4-b393-00fa4b69ad5b-host\") pod \"node-ca-ks9m9\" (UID: \"7254920c-50ea-4fc4-b393-00fa4b69ad5b\") " pod="openshift-image-registry/node-ca-ks9m9"
Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888481 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-var-lib-kubelet\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.888481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888486 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-sysconfig\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888510 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-systemd\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888530 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-host\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888541 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-sysconfig\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888572 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-systemd\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888581 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-run\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888586 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-host\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888611 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhcq\" (UniqueName: \"kubernetes.io/projected/7254920c-50ea-4fc4-b393-00fa4b69ad5b-kube-api-access-2dhcq\") pod \"node-ca-ks9m9\" (UID: \"7254920c-50ea-4fc4-b393-00fa4b69ad5b\") " pod="openshift-image-registry/node-ca-ks9m9"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888642 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-run\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888657 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-kubernetes\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c6bafdd-ce56-49a3-8070-bd48e97302b2-tmp\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/187a524e-8b69-4904-8985-6b33cf3dc3d1-agent-certs\") pod \"konnectivity-agent-vpzvb\" (UID: \"187a524e-8b69-4904-8985-6b33cf3dc3d1\") " pod="kube-system/konnectivity-agent-vpzvb"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888736 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/187a524e-8b69-4904-8985-6b33cf3dc3d1-konnectivity-ca\") pod \"konnectivity-agent-vpzvb\" (UID: \"187a524e-8b69-4904-8985-6b33cf3dc3d1\") " pod="kube-system/konnectivity-agent-vpzvb"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888728 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-kubernetes\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.889051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.888811 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7254920c-50ea-4fc4-b393-00fa4b69ad5b-serviceca\") pod \"node-ca-ks9m9\" (UID: \"7254920c-50ea-4fc4-b393-00fa4b69ad5b\") " pod="openshift-image-registry/node-ca-ks9m9"
Apr 17 07:51:22.889785 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.889206 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/187a524e-8b69-4904-8985-6b33cf3dc3d1-konnectivity-ca\") pod \"konnectivity-agent-vpzvb\" (UID: \"187a524e-8b69-4904-8985-6b33cf3dc3d1\") " pod="kube-system/konnectivity-agent-vpzvb"
Apr 17 07:51:22.890657 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.890636 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6c6bafdd-ce56-49a3-8070-bd48e97302b2-etc-tuned\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.890773 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.890693 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c6bafdd-ce56-49a3-8070-bd48e97302b2-tmp\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:22.891708 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.891690 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/187a524e-8b69-4904-8985-6b33cf3dc3d1-agent-certs\") pod \"konnectivity-agent-vpzvb\" (UID: \"187a524e-8b69-4904-8985-6b33cf3dc3d1\") " pod="kube-system/konnectivity-agent-vpzvb"
Apr 17 07:51:22.896679 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.896656 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhcq\" (UniqueName: \"kubernetes.io/projected/7254920c-50ea-4fc4-b393-00fa4b69ad5b-kube-api-access-2dhcq\") pod \"node-ca-ks9m9\" (UID: \"7254920c-50ea-4fc4-b393-00fa4b69ad5b\") " pod="openshift-image-registry/node-ca-ks9m9"
Apr 17 07:51:22.896753 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:22.896687 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvcgl\" (UniqueName: \"kubernetes.io/projected/6c6bafdd-ce56-49a3-8070-bd48e97302b2-kube-api-access-hvcgl\") pod \"tuned-cqr65\" (UID: \"6c6bafdd-ce56-49a3-8070-bd48e97302b2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:23.001917 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.001886 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zn97z"
Apr 17 07:51:23.008007 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:23.007980 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88aa5ccb_f3a4_4df2_90c3_b1450d9b5ad6.slice/crio-7b0485aae6fa9bc2711d0005b2dd0838e3b9a5a41fe69ea3ba7e52f398e16dbe WatchSource:0}: Error finding container 7b0485aae6fa9bc2711d0005b2dd0838e3b9a5a41fe69ea3ba7e52f398e16dbe: Status 404 returned error can't find the container with id 7b0485aae6fa9bc2711d0005b2dd0838e3b9a5a41fe69ea3ba7e52f398e16dbe
Apr 17 07:51:23.017573 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.017552 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:51:23.024299 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:23.024259 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5237768_3d38_4f21_8b97_c1ffd5d7cec2.slice/crio-42fea56ee9aa42317e4c55029c0b05d26c9681f489dc7b72c6319043eccbfc5f WatchSource:0}: Error finding container 42fea56ee9aa42317e4c55029c0b05d26c9681f489dc7b72c6319043eccbfc5f: Status 404 returned error can't find the container with id 42fea56ee9aa42317e4c55029c0b05d26c9681f489dc7b72c6319043eccbfc5f
Apr 17 07:51:23.039874 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.039827 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4dz7x"
Apr 17 07:51:23.043574 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.043558 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8"
Apr 17 07:51:23.046061 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:23.046037 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb277a7c_b922_4c78_a4fd_5882a862b97a.slice/crio-8683dacd055979aa7a48fed893900f97f4bed5fbf01a21ee3d69f0ba4e9adb0a WatchSource:0}: Error finding container 8683dacd055979aa7a48fed893900f97f4bed5fbf01a21ee3d69f0ba4e9adb0a: Status 404 returned error can't find the container with id 8683dacd055979aa7a48fed893900f97f4bed5fbf01a21ee3d69f0ba4e9adb0a
Apr 17 07:51:23.049126 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.049053 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qgq6m"
Apr 17 07:51:23.051108 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:23.051079 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod760ffb8e_f999_408c_901c_44fed91982db.slice/crio-ca4286c494878950240b0ddf01b328abe81e25df84457dbe3080fcb36e23ae1b WatchSource:0}: Error finding container ca4286c494878950240b0ddf01b328abe81e25df84457dbe3080fcb36e23ae1b: Status 404 returned error can't find the container with id ca4286c494878950240b0ddf01b328abe81e25df84457dbe3080fcb36e23ae1b
Apr 17 07:51:23.056727 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.054463 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-grgq6"
Apr 17 07:51:23.057763 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:23.057736 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb560c5d8_3216_49b3_be3c_2ad93d8b4e7a.slice/crio-9d6d93ca783754b21b771d7d9fc5f988e057e4f301a826baf1f09cbc4814200c WatchSource:0}: Error finding container 9d6d93ca783754b21b771d7d9fc5f988e057e4f301a826baf1f09cbc4814200c: Status 404 returned error can't find the container with id 9d6d93ca783754b21b771d7d9fc5f988e057e4f301a826baf1f09cbc4814200c
Apr 17 07:51:23.060376 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.060354 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vpzvb"
Apr 17 07:51:23.062201 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:23.062183 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefad7b51_5842_4aff_abb9_6379ecca5cc4.slice/crio-950abf11df3b8fb34b22f3f7c4ade8f5fd3da74cba7d9ea7d4f64aa66eac5d68 WatchSource:0}: Error finding container 950abf11df3b8fb34b22f3f7c4ade8f5fd3da74cba7d9ea7d4f64aa66eac5d68: Status 404 returned error can't find the container with id 950abf11df3b8fb34b22f3f7c4ade8f5fd3da74cba7d9ea7d4f64aa66eac5d68
Apr 17 07:51:23.068138 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:23.068113 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod187a524e_8b69_4904_8985_6b33cf3dc3d1.slice/crio-115b078cadb980da4505b4b7e927f9594d54a1af1ca71161b0b9b0a9a5385992 WatchSource:0}: Error finding container 115b078cadb980da4505b4b7e927f9594d54a1af1ca71161b0b9b0a9a5385992: Status 404 returned error can't find the container with id 115b078cadb980da4505b4b7e927f9594d54a1af1ca71161b0b9b0a9a5385992
Apr 17 07:51:23.068281 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.068197 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cqr65"
Apr 17 07:51:23.072848 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.072830 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ks9m9"
Apr 17 07:51:23.075875 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:23.075777 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c6bafdd_ce56_49a3_8070_bd48e97302b2.slice/crio-818009da1da4167bf69342912f2faed4725bccb70dcb30fbdd4e45e3fe067eb4 WatchSource:0}: Error finding container 818009da1da4167bf69342912f2faed4725bccb70dcb30fbdd4e45e3fe067eb4: Status 404 returned error can't find the container with id 818009da1da4167bf69342912f2faed4725bccb70dcb30fbdd4e45e3fe067eb4
Apr 17 07:51:23.080965 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:51:23.080938 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7254920c_50ea_4fc4_b393_00fa4b69ad5b.slice/crio-131750ec35b2155be5880ad7a0f2cb05d3c879e6f39e0f872bca509b670c5c02 WatchSource:0}: Error finding container 131750ec35b2155be5880ad7a0f2cb05d3c879e6f39e0f872bca509b670c5c02: Status 404 returned error can't find the container with id 131750ec35b2155be5880ad7a0f2cb05d3c879e6f39e0f872bca509b670c5c02
Apr 17 07:51:23.292553 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.292377 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:23.292553 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:23.292534 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:23.292820 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:23.292602 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs podName:86e593a1-ee06-4a3a-9bef-3d1c3097b01d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:24.292580733 +0000 UTC m=+3.095531457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs") pod "network-metrics-daemon-nnkhx" (UID: "86e593a1-ee06-4a3a-9bef-3d1c3097b01d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:23.394244 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.392845 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:51:23.394244 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:23.392999 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:23.394244 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:23.393017 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:23.394244 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:23.393030 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mnmfq for pod openshift-network-diagnostics/network-check-target-gz9xl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:23.394244 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:23.393085 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq podName:c2f87d28-3811-45b6-bdd2-ca07124aa872 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:24.393067546 +0000 UTC m=+3.196018268 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnmfq" (UniqueName: "kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq") pod "network-check-target-gz9xl" (UID: "c2f87d28-3811-45b6-bdd2-ca07124aa872") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:23.451037 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.451005 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:23.716824 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.716667 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:22 +0000 UTC" deadline="2027-11-09 00:41:22.614966189 +0000 UTC"
Apr 17 07:51:23.716824 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.716703 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13696h49m58.898266707s"
Apr 17 07:51:23.811126 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.811089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-grgq6" event={"ID":"efad7b51-5842-4aff-abb9-6379ecca5cc4","Type":"ContainerStarted","Data":"950abf11df3b8fb34b22f3f7c4ade8f5fd3da74cba7d9ea7d4f64aa66eac5d68"}
Apr 17 07:51:23.817978 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.817940 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qgq6m" event={"ID":"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a","Type":"ContainerStarted","Data":"9d6d93ca783754b21b771d7d9fc5f988e057e4f301a826baf1f09cbc4814200c"}
Apr 17 07:51:23.826706 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.826381 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:23.834676 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.834639 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" event={"ID":"760ffb8e-f999-408c-901c-44fed91982db","Type":"ContainerStarted","Data":"ca4286c494878950240b0ddf01b328abe81e25df84457dbe3080fcb36e23ae1b"}
Apr 17 07:51:23.846702 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.846599 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4dz7x" event={"ID":"bb277a7c-b922-4c78-a4fd-5882a862b97a","Type":"ContainerStarted","Data":"8683dacd055979aa7a48fed893900f97f4bed5fbf01a21ee3d69f0ba4e9adb0a"}
Apr 17 07:51:23.855582 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.855545 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"42fea56ee9aa42317e4c55029c0b05d26c9681f489dc7b72c6319043eccbfc5f"}
Apr 17 07:51:23.861525 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.861468 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cqr65" event={"ID":"6c6bafdd-ce56-49a3-8070-bd48e97302b2","Type":"ContainerStarted","Data":"818009da1da4167bf69342912f2faed4725bccb70dcb30fbdd4e45e3fe067eb4"}
Apr 17 07:51:23.866337 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.866272 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zn97z" event={"ID":"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6","Type":"ContainerStarted","Data":"7b0485aae6fa9bc2711d0005b2dd0838e3b9a5a41fe69ea3ba7e52f398e16dbe"}
Apr 17 07:51:23.881959 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.881919 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ks9m9" event={"ID":"7254920c-50ea-4fc4-b393-00fa4b69ad5b","Type":"ContainerStarted","Data":"131750ec35b2155be5880ad7a0f2cb05d3c879e6f39e0f872bca509b670c5c02"}
Apr 17 07:51:23.887434 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:23.887372 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vpzvb" event={"ID":"187a524e-8b69-4904-8985-6b33cf3dc3d1","Type":"ContainerStarted","Data":"115b078cadb980da4505b4b7e927f9594d54a1af1ca71161b0b9b0a9a5385992"}
Apr 17 07:51:24.139094 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:24.138835 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:24.298866 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:24.298828 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:24.299062 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:24.298975 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:24.299062 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:24.299039 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs podName:86e593a1-ee06-4a3a-9bef-3d1c3097b01d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:26.299019845 +0000 UTC m=+5.101970570 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs") pod "network-metrics-daemon-nnkhx" (UID: "86e593a1-ee06-4a3a-9bef-3d1c3097b01d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:24.400234 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:24.400145 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:51:24.400407 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:24.400351 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:24.400407 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:24.400370 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:24.400407 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:24.400384 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mnmfq for pod openshift-network-diagnostics/network-check-target-gz9xl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:24.400582 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:24.400439 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq podName:c2f87d28-3811-45b6-bdd2-ca07124aa872 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:26.400422242 +0000 UTC m=+5.203372962 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnmfq" (UniqueName: "kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq") pod "network-check-target-gz9xl" (UID: "c2f87d28-3811-45b6-bdd2-ca07124aa872") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:24.717539 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:24.717440 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:22 +0000 UTC" deadline="2027-12-27 00:40:35.037187462 +0000 UTC"
Apr 17 07:51:24.717539 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:24.717492 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14848h49m10.319700214s"
Apr 17 07:51:24.778935 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:24.778900 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:51:24.779130 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:24.779039 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872"
Apr 17 07:51:24.779544 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:24.779518 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:24.779678 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:24.779637 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d"
Apr 17 07:51:26.318933 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:26.318895 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:26.319449 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:26.319044 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:26.319449 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:26.319120 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs podName:86e593a1-ee06-4a3a-9bef-3d1c3097b01d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:30.31909376 +0000 UTC m=+9.122044493 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs") pod "network-metrics-daemon-nnkhx" (UID: "86e593a1-ee06-4a3a-9bef-3d1c3097b01d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:26.420046 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:26.420011 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:51:26.420253 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:26.420206 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:26.420253 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:26.420229 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:26.420387 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:26.420276 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mnmfq for pod openshift-network-diagnostics/network-check-target-gz9xl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:26.420387 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:26.420357 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq podName:c2f87d28-3811-45b6-bdd2-ca07124aa872 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:30.420337546 +0000 UTC m=+9.223288279 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnmfq" (UniqueName: "kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq") pod "network-check-target-gz9xl" (UID: "c2f87d28-3811-45b6-bdd2-ca07124aa872") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:26.779271 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:26.779176 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:51:26.779471 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:26.779341 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872"
Apr 17 07:51:26.779794 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:26.779773 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:26.779893 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:26.779873 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d"
Apr 17 07:51:28.779084 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:28.779049 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:28.779550 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:28.779181 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d"
Apr 17 07:51:28.779613 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:28.779559 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:51:28.779706 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:28.779657 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872"
Apr 17 07:51:30.352690 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:30.352649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:30.353250 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:30.352814 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:30.353250 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:30.352866 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs podName:86e593a1-ee06-4a3a-9bef-3d1c3097b01d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:38.352852525 +0000 UTC m=+17.155803244 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs") pod "network-metrics-daemon-nnkhx" (UID: "86e593a1-ee06-4a3a-9bef-3d1c3097b01d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:30.453576 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:30.453470 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:30.453771 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:30.453635 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:30.453771 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:30.453656 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:30.453771 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:30.453667 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mnmfq for pod openshift-network-diagnostics/network-check-target-gz9xl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:30.453771 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:30.453733 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq podName:c2f87d28-3811-45b6-bdd2-ca07124aa872 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:38.453715261 +0000 UTC m=+17.256665993 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnmfq" (UniqueName: "kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq") pod "network-check-target-gz9xl" (UID: "c2f87d28-3811-45b6-bdd2-ca07124aa872") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:30.778268 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:30.778212 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:30.778437 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:30.778360 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:30.778943 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:30.778772 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:30.778943 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:30.778876 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:32.778735 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:32.778692 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:32.779185 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:32.778693 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:32.779185 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:32.778818 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:32.779185 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:32.778947 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:34.779158 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:34.779118 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:34.779609 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:34.779119 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:34.779609 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:34.779254 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:34.779609 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:34.779341 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:36.779245 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:36.779213 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:36.779670 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:36.779213 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:36.779670 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:36.779359 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:36.779670 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:36.779413 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:38.408127 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:38.408080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:38.408644 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:38.408223 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:38.408644 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:38.408320 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs podName:86e593a1-ee06-4a3a-9bef-3d1c3097b01d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:54.408298712 +0000 UTC m=+33.211249445 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs") pod "network-metrics-daemon-nnkhx" (UID: "86e593a1-ee06-4a3a-9bef-3d1c3097b01d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:38.508708 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:38.508668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:38.508941 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:38.508829 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:38.508941 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:38.508849 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:38.508941 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:38.508860 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mnmfq for pod openshift-network-diagnostics/network-check-target-gz9xl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:38.508941 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:38.508922 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq podName:c2f87d28-3811-45b6-bdd2-ca07124aa872 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:54.508907965 +0000 UTC m=+33.311858685 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnmfq" (UniqueName: "kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq") pod "network-check-target-gz9xl" (UID: "c2f87d28-3811-45b6-bdd2-ca07124aa872") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:38.778539 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:38.778505 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:38.778791 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:38.778505 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:38.778791 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:38.778616 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:38.778791 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:38.778684 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:40.778942 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:40.778906 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:40.778942 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:40.778936 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:40.779368 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:40.778999 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:40.779368 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:40.779126 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:41.929518 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.929234 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal" event={"ID":"90295ff52276f4357ebdbd2fe3bf0ff4","Type":"ContainerStarted","Data":"de0028bba42a47700fe0992cc5ed9dd3d6684883212ea0c6c1cae442d8b69fac"} Apr 17 07:51:41.935556 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.935394 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-grgq6" event={"ID":"efad7b51-5842-4aff-abb9-6379ecca5cc4","Type":"ContainerStarted","Data":"389fd28cd634ecf9a3adccc0a8558354502d537042d7034165366a7a7f628949"} Apr 17 07:51:41.937722 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.937702 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"939b83ec619c3e8a835ab087ff981ccf3823e9142f8a3049170b7e0bd3f11973"} Apr 17 07:51:41.937816 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.937730 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"96228b10eb7555c14802fdebe5fc8351c941f4fb5b9964c8f6c72f7cb3f25a4d"} Apr 17 07:51:41.937816 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.937744 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"e114a0270c344f72d7b8e00fd651d03984e5602f0dfc47620a9a91462a7b726a"} Apr 17 07:51:41.937816 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.937754 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" 
event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"4f1dc144f2638da77cbf509bd24c7f7ab7faed906fb72bb3fd9ca77323755d7b"} Apr 17 07:51:41.939193 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.939167 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cqr65" event={"ID":"6c6bafdd-ce56-49a3-8070-bd48e97302b2","Type":"ContainerStarted","Data":"3895b08c005bfb753fca6f7f24122b15d4fcdacc8230cbce957423e5c36ad8be"} Apr 17 07:51:41.943735 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.943698 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-28.ec2.internal" podStartSLOduration=20.943686494 podStartE2EDuration="20.943686494s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:41.943271695 +0000 UTC m=+20.746222435" watchObservedRunningTime="2026-04-17 07:51:41.943686494 +0000 UTC m=+20.746637239" Apr 17 07:51:41.959757 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.959719 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-grgq6" podStartSLOduration=2.583048196 podStartE2EDuration="20.959707269s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.064504286 +0000 UTC m=+1.867455022" lastFinishedPulling="2026-04-17 07:51:41.441163373 +0000 UTC m=+20.244114095" observedRunningTime="2026-04-17 07:51:41.95892946 +0000 UTC m=+20.761880201" watchObservedRunningTime="2026-04-17 07:51:41.959707269 +0000 UTC m=+20.762658039" Apr 17 07:51:41.973512 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:41.973467 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cqr65" podStartSLOduration=2.807901372 
podStartE2EDuration="20.973454076s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.078130722 +0000 UTC m=+1.881081448" lastFinishedPulling="2026-04-17 07:51:41.243683425 +0000 UTC m=+20.046634152" observedRunningTime="2026-04-17 07:51:41.973378813 +0000 UTC m=+20.776329555" watchObservedRunningTime="2026-04-17 07:51:41.973454076 +0000 UTC m=+20.776404817" Apr 17 07:51:42.778407 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:42.778209 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:42.778580 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:42.778209 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:42.778580 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:42.778510 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:42.778580 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:42.778548 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:42.943007 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:42.942982 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 07:51:42.943561 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:42.943256 2567 generic.go:358] "Generic (PLEG): container finished" podID="a5237768-3d38-4f21-8b97-c1ffd5d7cec2" containerID="e114a0270c344f72d7b8e00fd651d03984e5602f0dfc47620a9a91462a7b726a" exitCode=1 Apr 17 07:51:42.943561 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:42.943313 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerDied","Data":"e114a0270c344f72d7b8e00fd651d03984e5602f0dfc47620a9a91462a7b726a"} Apr 17 07:51:42.943561 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:42.943347 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"aac12279e6ebf13af03ff5d2cac5b40b6b61d27fa164bace7b3d34096981b881"} Apr 17 07:51:42.943561 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:42.943361 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"6f7628ae52a969364ec5d08b137ae71ab67003c0f011e75b94ade9c80b6f9c71"} Apr 17 07:51:42.944421 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:42.944398 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zn97z" event={"ID":"88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6","Type":"ContainerStarted","Data":"31fa170404e685224677f7b1f2a96f7780178512e6f9404390c1c88512ca6773"} Apr 17 07:51:42.959608 
ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:42.959567 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zn97z" podStartSLOduration=3.727086458 podStartE2EDuration="21.959556541s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.009608231 +0000 UTC m=+1.812558954" lastFinishedPulling="2026-04-17 07:51:41.242078304 +0000 UTC m=+20.045029037" observedRunningTime="2026-04-17 07:51:42.959512897 +0000 UTC m=+21.762463638" watchObservedRunningTime="2026-04-17 07:51:42.959556541 +0000 UTC m=+21.762507283" Apr 17 07:51:44.778249 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.778218 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:44.778677 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:44.778368 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:44.778677 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.778409 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:44.778677 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:44.778512 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:44.949799 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.949515 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ks9m9" event={"ID":"7254920c-50ea-4fc4-b393-00fa4b69ad5b","Type":"ContainerStarted","Data":"70ead7b2a2661152f744659c4df2875c2fe6a8138a2ffe680a402e2878e230c4"} Apr 17 07:51:44.950760 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.950728 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vpzvb" event={"ID":"187a524e-8b69-4904-8985-6b33cf3dc3d1","Type":"ContainerStarted","Data":"a4d8ac8ffdc6991650eadbc79eecba412a106d49e5e35b713b086fe75b62a4ac"} Apr 17 07:51:44.952084 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.952060 2567 generic.go:358] "Generic (PLEG): container finished" podID="b560c5d8-3216-49b3-be3c-2ad93d8b4e7a" containerID="8fc615e8a5b767d88c59186bacc374bfd22fa848aff11ee80ca6dd87971f645a" exitCode=0 Apr 17 07:51:44.952192 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.952126 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qgq6m" event={"ID":"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a","Type":"ContainerDied","Data":"8fc615e8a5b767d88c59186bacc374bfd22fa848aff11ee80ca6dd87971f645a"} Apr 17 07:51:44.953517 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.953474 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" event={"ID":"760ffb8e-f999-408c-901c-44fed91982db","Type":"ContainerStarted","Data":"ecba103e852b3d60f434d1e69249cbcf9c02cdcf7270be41b8693124494d43f7"} Apr 17 07:51:44.954647 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.954623 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4dz7x" 
event={"ID":"bb277a7c-b922-4c78-a4fd-5882a862b97a","Type":"ContainerStarted","Data":"752433662ab6a1846b4b67247dd1be93f5efabe35bf2e9797f0999eb7ee22e2e"} Apr 17 07:51:44.957216 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.957200 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 07:51:44.957686 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.957667 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"2dc6d0f731246a9ccfd3c70cdd84a4c471e1be7119f7145af247d78f760b4217"} Apr 17 07:51:44.958823 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.958805 2567 generic.go:358] "Generic (PLEG): container finished" podID="1260d269b19f56cf5f4a135125663f49" containerID="7dba1c0a725ab4da3674bc773b4d44b2bce28596cbb7c0dcc9adb6b468de8091" exitCode=0 Apr 17 07:51:44.958881 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.958835 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal" event={"ID":"1260d269b19f56cf5f4a135125663f49","Type":"ContainerDied","Data":"7dba1c0a725ab4da3674bc773b4d44b2bce28596cbb7c0dcc9adb6b468de8091"} Apr 17 07:51:44.964450 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.964417 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ks9m9" podStartSLOduration=5.831573628 podStartE2EDuration="23.96440682s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.08278981 +0000 UTC m=+1.885740535" lastFinishedPulling="2026-04-17 07:51:41.215622995 +0000 UTC m=+20.018573727" observedRunningTime="2026-04-17 07:51:44.964161387 +0000 UTC m=+23.767112129" watchObservedRunningTime="2026-04-17 07:51:44.96440682 +0000 UTC 
m=+23.767357560" Apr 17 07:51:44.977652 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:44.977618 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4dz7x" podStartSLOduration=5.809730287 podStartE2EDuration="23.977607628s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.047744138 +0000 UTC m=+1.850694858" lastFinishedPulling="2026-04-17 07:51:41.21562148 +0000 UTC m=+20.018572199" observedRunningTime="2026-04-17 07:51:44.977330858 +0000 UTC m=+23.780281600" watchObservedRunningTime="2026-04-17 07:51:44.977607628 +0000 UTC m=+23.780558369" Apr 17 07:51:45.932672 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:45.932633 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 07:51:45.962978 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:45.962946 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" event={"ID":"760ffb8e-f999-408c-901c-44fed91982db","Type":"ContainerStarted","Data":"4e69aaa7ac791ac0ccfb2bcf217a5db32b75ac17d64418a95a44c2473bc0e29a"} Apr 17 07:51:45.977681 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:45.977635 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vpzvb" podStartSLOduration=6.832058367 podStartE2EDuration="24.977619301s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.070049464 +0000 UTC m=+1.873000185" lastFinishedPulling="2026-04-17 07:51:41.215610382 +0000 UTC m=+20.018561119" observedRunningTime="2026-04-17 07:51:45.010354926 +0000 UTC m=+23.813305672" watchObservedRunningTime="2026-04-17 07:51:45.977619301 +0000 UTC m=+24.780570042" Apr 17 07:51:46.355642 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.355441 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vpzvb" Apr 17 07:51:46.646353 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.646215 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vpzvb" Apr 17 07:51:46.647212 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.647193 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vpzvb" Apr 17 07:51:46.746316 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.746199 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:51:45.932653642Z","UUID":"cc1ed47d-5ca7-4b23-9f00-b49ee53b1c4e","Handler":null,"Name":"","Endpoint":""} Apr 17 07:51:46.748501 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.748476 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 07:51:46.749208 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.748923 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 07:51:46.778571 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.778540 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:46.778711 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.778547 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:46.778711 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:46.778651 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:46.778826 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:46.778781 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:46.968259 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.968225 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" event={"ID":"760ffb8e-f999-408c-901c-44fed91982db","Type":"ContainerStarted","Data":"64745843dad4acbb813db3e55f3fe8c900e54a08c1f1cc9a1e5994486f2e6223"} Apr 17 07:51:46.972640 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.972620 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 07:51:46.973035 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.972998 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"6b8663ad9903efe124cc2be7331e53206dc1450b77c5d458e1dc7f336698329f"} Apr 17 
07:51:46.973567 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.973545 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:46.973672 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.973582 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:46.973672 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.973588 2567 scope.go:117] "RemoveContainer" containerID="e114a0270c344f72d7b8e00fd651d03984e5602f0dfc47620a9a91462a7b726a" Apr 17 07:51:46.973778 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.973596 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:46.975013 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.974988 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal" event={"ID":"1260d269b19f56cf5f4a135125663f49","Type":"ContainerStarted","Data":"7c69bc447345795ba4d899663ce9545a45fd86f3779907676240dc475a7c87ee"} Apr 17 07:51:46.975706 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.975690 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vpzvb" Apr 17 07:51:46.990883 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.990853 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:46.991892 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:46.991798 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" Apr 17 07:51:47.061597 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:47.061545 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cr5t8" podStartSLOduration=2.306980693 podStartE2EDuration="26.061528982s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.052959568 +0000 UTC m=+1.855910288" lastFinishedPulling="2026-04-17 07:51:46.807507859 +0000 UTC m=+25.610458577" observedRunningTime="2026-04-17 07:51:46.988563587 +0000 UTC m=+25.791514362" watchObservedRunningTime="2026-04-17 07:51:47.061528982 +0000 UTC m=+25.864479774" Apr 17 07:51:47.120258 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:47.120206 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-28.ec2.internal" podStartSLOduration=26.120190438 podStartE2EDuration="26.120190438s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:47.094252635 +0000 UTC m=+25.897203375" watchObservedRunningTime="2026-04-17 07:51:47.120190438 +0000 UTC m=+25.923141180" Apr 17 07:51:47.980756 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:47.980721 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 07:51:47.981160 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:47.981098 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" event={"ID":"a5237768-3d38-4f21-8b97-c1ffd5d7cec2","Type":"ContainerStarted","Data":"0371c451d3f1e9734b7c2197db36dbdba8fd9f7d2c8630d865dddb063debdfd9"} Apr 17 07:51:48.019903 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:48.019852 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp" podStartSLOduration=8.762754252 podStartE2EDuration="27.019833604s" 
podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.026004866 +0000 UTC m=+1.828955589" lastFinishedPulling="2026-04-17 07:51:41.283084219 +0000 UTC m=+20.086034941" observedRunningTime="2026-04-17 07:51:48.01922535 +0000 UTC m=+26.822176090" watchObservedRunningTime="2026-04-17 07:51:48.019833604 +0000 UTC m=+26.822784345" Apr 17 07:51:48.401582 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:48.401344 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gz9xl"] Apr 17 07:51:48.401761 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:48.401704 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:48.401847 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:48.401823 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:48.404226 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:48.403921 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nnkhx"] Apr 17 07:51:48.404226 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:48.404058 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:48.404226 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:48.404176 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:49.778902 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:49.778869 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:49.779537 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:49.778988 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:49.986263 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:49.986227 2567 generic.go:358] "Generic (PLEG): container finished" podID="b560c5d8-3216-49b3-be3c-2ad93d8b4e7a" containerID="adf8ea075d6d276d13b9a2531b4d9e44076d70f0fe7e9f19f56c67e70954c5a4" exitCode=0 Apr 17 07:51:49.986466 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:49.986307 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qgq6m" event={"ID":"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a","Type":"ContainerDied","Data":"adf8ea075d6d276d13b9a2531b4d9e44076d70f0fe7e9f19f56c67e70954c5a4"} Apr 17 07:51:50.778795 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:50.778766 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:50.778959 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:50.778895 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:50.990277 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:50.990241 2567 generic.go:358] "Generic (PLEG): container finished" podID="b560c5d8-3216-49b3-be3c-2ad93d8b4e7a" containerID="163890b38541ac7a1ffc81e255c2d7ed9a021fb412e0bc4ad5082504727f5c94" exitCode=0 Apr 17 07:51:50.990434 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:50.990327 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qgq6m" event={"ID":"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a","Type":"ContainerDied","Data":"163890b38541ac7a1ffc81e255c2d7ed9a021fb412e0bc4ad5082504727f5c94"} Apr 17 07:51:51.779506 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:51.779235 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:51.780012 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:51.779519 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:51.999026 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:51.998984 2567 generic.go:358] "Generic (PLEG): container finished" podID="b560c5d8-3216-49b3-be3c-2ad93d8b4e7a" containerID="ef5e4d01e75fa9c79c0f477aaea8f80c6a10fede5da5560aba95220c76daffcf" exitCode=0 Apr 17 07:51:51.999179 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:51.999043 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qgq6m" event={"ID":"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a","Type":"ContainerDied","Data":"ef5e4d01e75fa9c79c0f477aaea8f80c6a10fede5da5560aba95220c76daffcf"} Apr 17 07:51:52.778839 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:52.778795 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:52.779006 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:52.778932 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gz9xl" podUID="c2f87d28-3811-45b6-bdd2-ca07124aa872" Apr 17 07:51:53.778923 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:53.778883 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:53.779405 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:53.779012 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nnkhx" podUID="86e593a1-ee06-4a3a-9bef-3d1c3097b01d" Apr 17 07:51:54.423198 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.423161 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:51:54.423396 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.423308 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:54.423396 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.423376 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs podName:86e593a1-ee06-4a3a-9bef-3d1c3097b01d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.42335537 +0000 UTC m=+65.226306107 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs") pod "network-metrics-daemon-nnkhx" (UID: "86e593a1-ee06-4a3a-9bef-3d1c3097b01d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:54.474176 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.474148 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-28.ec2.internal" event="NodeReady" Apr 17 07:51:54.474365 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.474340 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 07:51:54.517428 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.517393 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mspd5"] Apr 17 07:51:54.521382 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.521352 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sm78q"] Apr 17 07:51:54.521555 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.521543 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.524114 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.524089 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " pod="openshift-network-diagnostics/network-check-target-gz9xl" Apr 17 07:51:54.524274 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.524248 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:54.524274 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.524272 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:54.524438 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.524301 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mnmfq for pod openshift-network-diagnostics/network-check-target-gz9xl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:54.524438 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.524333 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mw88v\"" Apr 17 07:51:54.524438 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.524355 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq podName:c2f87d28-3811-45b6-bdd2-ca07124aa872 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:26.524337072 +0000 UTC m=+65.327287808 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnmfq" (UniqueName: "kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq") pod "network-check-target-gz9xl" (UID: "c2f87d28-3811-45b6-bdd2-ca07124aa872") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:54.524438 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.524366 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 07:51:54.524438 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.524395 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 07:51:54.525268 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.525252 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sm78q" Apr 17 07:51:54.527594 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.527575 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 07:51:54.527696 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.527636 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 07:51:54.528317 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.528138 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 07:51:54.528621 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.528586 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9vrx5\"" Apr 17 07:51:54.529331 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.529268 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mspd5"] Apr 17 07:51:54.532995 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.532975 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sm78q"] Apr 17 07:51:54.624569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.624524 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5e3cf222-71f9-4a25-88bb-37c528ac2994-tmp-dir\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.624753 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.624586 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" 
(UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q" Apr 17 07:51:54.624753 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.624619 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6p92\" (UniqueName: \"kubernetes.io/projected/fcbda289-b762-45ea-ba60-5188e612db63-kube-api-access-g6p92\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q" Apr 17 07:51:54.624753 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.624641 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e3cf222-71f9-4a25-88bb-37c528ac2994-config-volume\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.624753 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.624671 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7h7c\" (UniqueName: \"kubernetes.io/projected/5e3cf222-71f9-4a25-88bb-37c528ac2994-kube-api-access-z7h7c\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.624753 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.624718 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.726036 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.725947 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.726036 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.726015 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5e3cf222-71f9-4a25-88bb-37c528ac2994-tmp-dir\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.726036 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.726039 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q" Apr 17 07:51:54.726360 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.726122 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:51:54.726360 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.726161 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6p92\" (UniqueName: \"kubernetes.io/projected/fcbda289-b762-45ea-ba60-5188e612db63-kube-api-access-g6p92\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q" Apr 17 07:51:54.726360 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.726210 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls podName:5e3cf222-71f9-4a25-88bb-37c528ac2994 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:55.226188321 +0000 UTC m=+34.029139057 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls") pod "dns-default-mspd5" (UID: "5e3cf222-71f9-4a25-88bb-37c528ac2994") : secret "dns-default-metrics-tls" not found Apr 17 07:51:54.726360 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.726127 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:51:54.726360 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.726230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e3cf222-71f9-4a25-88bb-37c528ac2994-config-volume\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.726360 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:54.726273 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert podName:fcbda289-b762-45ea-ba60-5188e612db63 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:55.226254734 +0000 UTC m=+34.029205458 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert") pod "ingress-canary-sm78q" (UID: "fcbda289-b762-45ea-ba60-5188e612db63") : secret "canary-serving-cert" not found Apr 17 07:51:54.726360 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.726360 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7h7c\" (UniqueName: \"kubernetes.io/projected/5e3cf222-71f9-4a25-88bb-37c528ac2994-kube-api-access-z7h7c\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.726965 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.726939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5e3cf222-71f9-4a25-88bb-37c528ac2994-tmp-dir\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.727064 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.726781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e3cf222-71f9-4a25-88bb-37c528ac2994-config-volume\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.739130 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.739099 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7h7c\" (UniqueName: \"kubernetes.io/projected/5e3cf222-71f9-4a25-88bb-37c528ac2994-kube-api-access-z7h7c\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:51:54.739330 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.739107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6p92\" (UniqueName: 
\"kubernetes.io/projected/fcbda289-b762-45ea-ba60-5188e612db63-kube-api-access-g6p92\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q"
Apr 17 07:51:54.779181 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.779144 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:51:54.781726 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.781701 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:51:54.781846 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.781730 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bmlbc\""
Apr 17 07:51:54.781846 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:54.781732 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:51:55.230445 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:55.230407 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q"
Apr 17 07:51:55.230741 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:55.230486 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5"
Apr 17 07:51:55.230741 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:55.230586 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:51:55.230741 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:55.230612 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:51:55.230741 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:55.230679 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls podName:5e3cf222-71f9-4a25-88bb-37c528ac2994 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:56.230658617 +0000 UTC m=+35.033609337 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls") pod "dns-default-mspd5" (UID: "5e3cf222-71f9-4a25-88bb-37c528ac2994") : secret "dns-default-metrics-tls" not found
Apr 17 07:51:55.230741 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:55.230698 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert podName:fcbda289-b762-45ea-ba60-5188e612db63 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:56.230689415 +0000 UTC m=+35.033640134 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert") pod "ingress-canary-sm78q" (UID: "fcbda289-b762-45ea-ba60-5188e612db63") : secret "canary-serving-cert" not found
Apr 17 07:51:55.778790 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:55.778751 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:51:55.782980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:55.782948 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjzzs\""
Apr 17 07:51:55.783483 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:55.782948 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:51:56.239700 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:56.239598 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q"
Apr 17 07:51:56.239700 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:56.239686 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5"
Apr 17 07:51:56.239889 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:56.239770 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:51:56.239889 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:56.239819 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:51:56.239889 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:56.239889 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls podName:5e3cf222-71f9-4a25-88bb-37c528ac2994 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:58.239875888 +0000 UTC m=+37.042826606 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls") pod "dns-default-mspd5" (UID: "5e3cf222-71f9-4a25-88bb-37c528ac2994") : secret "dns-default-metrics-tls" not found
Apr 17 07:51:56.240028 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:56.239903 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert podName:fcbda289-b762-45ea-ba60-5188e612db63 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:58.239896884 +0000 UTC m=+37.042847603 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert") pod "ingress-canary-sm78q" (UID: "fcbda289-b762-45ea-ba60-5188e612db63") : secret "canary-serving-cert" not found
Apr 17 07:51:58.014191 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:58.014103 2567 generic.go:358] "Generic (PLEG): container finished" podID="b560c5d8-3216-49b3-be3c-2ad93d8b4e7a" containerID="855c40b8a6648191a9e302d9f16214ee80accfe9187dae6f8b022a90daa8926f" exitCode=0
Apr 17 07:51:58.014191 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:58.014162 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qgq6m" event={"ID":"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a","Type":"ContainerDied","Data":"855c40b8a6648191a9e302d9f16214ee80accfe9187dae6f8b022a90daa8926f"}
Apr 17 07:51:58.256021 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:58.255984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5"
Apr 17 07:51:58.256179 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:58.256067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q"
Apr 17 07:51:58.256179 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:58.256128 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:51:58.256279 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:58.256187 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:51:58.256279 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:58.256198 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls podName:5e3cf222-71f9-4a25-88bb-37c528ac2994 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:02.256181571 +0000 UTC m=+41.059132289 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls") pod "dns-default-mspd5" (UID: "5e3cf222-71f9-4a25-88bb-37c528ac2994") : secret "dns-default-metrics-tls" not found
Apr 17 07:51:58.256279 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:51:58.256235 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert podName:fcbda289-b762-45ea-ba60-5188e612db63 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:02.256219388 +0000 UTC m=+41.059170121 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert") pod "ingress-canary-sm78q" (UID: "fcbda289-b762-45ea-ba60-5188e612db63") : secret "canary-serving-cert" not found
Apr 17 07:51:59.018528 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:59.018490 2567 generic.go:358] "Generic (PLEG): container finished" podID="b560c5d8-3216-49b3-be3c-2ad93d8b4e7a" containerID="b63a7ab03dfef68e606d4b12b6fbf31917e451019e22360fdea25923c2e890ff" exitCode=0
Apr 17 07:51:59.018911 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:51:59.018560 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qgq6m" event={"ID":"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a","Type":"ContainerDied","Data":"b63a7ab03dfef68e606d4b12b6fbf31917e451019e22360fdea25923c2e890ff"}
Apr 17 07:52:00.023831 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:00.023648 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qgq6m" event={"ID":"b560c5d8-3216-49b3-be3c-2ad93d8b4e7a","Type":"ContainerStarted","Data":"2d599fa963c07f8bbad8fe7fdda7f24d02c379fbb793ccd44c06d6bfbe95689e"}
Apr 17 07:52:00.046387 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:00.046333 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qgq6m" podStartSLOduration=4.595054314 podStartE2EDuration="39.046319166s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.059419236 +0000 UTC m=+1.862369954" lastFinishedPulling="2026-04-17 07:51:57.510684083 +0000 UTC m=+36.313634806" observedRunningTime="2026-04-17 07:52:00.044932278 +0000 UTC m=+38.847883034" watchObservedRunningTime="2026-04-17 07:52:00.046319166 +0000 UTC m=+38.849269901"
Apr 17 07:52:02.285906 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:02.285867 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5"
Apr 17 07:52:02.286387 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:02.285927 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q"
Apr 17 07:52:02.286387 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:02.286018 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:02.286387 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:02.286025 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:02.286387 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:02.286079 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert podName:fcbda289-b762-45ea-ba60-5188e612db63 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:10.286065303 +0000 UTC m=+49.089016023 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert") pod "ingress-canary-sm78q" (UID: "fcbda289-b762-45ea-ba60-5188e612db63") : secret "canary-serving-cert" not found
Apr 17 07:52:02.286387 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:02.286093 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls podName:5e3cf222-71f9-4a25-88bb-37c528ac2994 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:10.286086835 +0000 UTC m=+49.089037554 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls") pod "dns-default-mspd5" (UID: "5e3cf222-71f9-4a25-88bb-37c528ac2994") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:10.341272 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:10.341225 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q"
Apr 17 07:52:10.341811 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:10.341317 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5"
Apr 17 07:52:10.341811 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:10.341393 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:10.341811 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:10.341414 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:10.341811 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:10.341471 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert podName:fcbda289-b762-45ea-ba60-5188e612db63 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.341455247 +0000 UTC m=+65.144405967 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert") pod "ingress-canary-sm78q" (UID: "fcbda289-b762-45ea-ba60-5188e612db63") : secret "canary-serving-cert" not found
Apr 17 07:52:10.341811 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:10.341486 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls podName:5e3cf222-71f9-4a25-88bb-37c528ac2994 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.341479669 +0000 UTC m=+65.144430387 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls") pod "dns-default-mspd5" (UID: "5e3cf222-71f9-4a25-88bb-37c528ac2994") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:18.992934 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:18.992905 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4l4mp"
Apr 17 07:52:26.350843 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.350804 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5"
Apr 17 07:52:26.351213 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.350879 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q"
Apr 17 07:52:26.351213 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:26.350965 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:26.351213 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:26.350965 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:26.351213 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:26.351038 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls podName:5e3cf222-71f9-4a25-88bb-37c528ac2994 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:58.351022875 +0000 UTC m=+97.153973597 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls") pod "dns-default-mspd5" (UID: "5e3cf222-71f9-4a25-88bb-37c528ac2994") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:26.351213 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:26.351052 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert podName:fcbda289-b762-45ea-ba60-5188e612db63 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:58.351046878 +0000 UTC m=+97.153997597 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert") pod "ingress-canary-sm78q" (UID: "fcbda289-b762-45ea-ba60-5188e612db63") : secret "canary-serving-cert" not found
Apr 17 07:52:26.451330 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.451282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx"
Apr 17 07:52:26.453807 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.453787 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:52:26.461747 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:26.461722 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:52:26.461832 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:26.461778 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs podName:86e593a1-ee06-4a3a-9bef-3d1c3097b01d nodeName:}" failed. No retries permitted until 2026-04-17 07:53:30.46176386 +0000 UTC m=+129.264714583 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs") pod "network-metrics-daemon-nnkhx" (UID: "86e593a1-ee06-4a3a-9bef-3d1c3097b01d") : secret "metrics-daemon-secret" not found
Apr 17 07:52:26.552480 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.552448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:52:26.555199 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.555180 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:52:26.565672 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.565644 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:52:26.577709 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.577683 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnmfq\" (UniqueName: \"kubernetes.io/projected/c2f87d28-3811-45b6-bdd2-ca07124aa872-kube-api-access-mnmfq\") pod \"network-check-target-gz9xl\" (UID: \"c2f87d28-3811-45b6-bdd2-ca07124aa872\") " pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:52:26.591721 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.591696 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bmlbc\""
Apr 17 07:52:26.599585 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.599563 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:52:26.749501 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:26.749477 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gz9xl"]
Apr 17 07:52:26.759838 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:52:26.759815 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f87d28_3811_45b6_bdd2_ca07124aa872.slice/crio-90cfe8315714f2748c62fc7ffc7bae48b98abac456cf2ee10e67996ed77a1400 WatchSource:0}: Error finding container 90cfe8315714f2748c62fc7ffc7bae48b98abac456cf2ee10e67996ed77a1400: Status 404 returned error can't find the container with id 90cfe8315714f2748c62fc7ffc7bae48b98abac456cf2ee10e67996ed77a1400
Apr 17 07:52:27.073013 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:27.072974 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gz9xl" event={"ID":"c2f87d28-3811-45b6-bdd2-ca07124aa872","Type":"ContainerStarted","Data":"90cfe8315714f2748c62fc7ffc7bae48b98abac456cf2ee10e67996ed77a1400"}
Apr 17 07:52:30.079638 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:30.079603 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gz9xl" event={"ID":"c2f87d28-3811-45b6-bdd2-ca07124aa872","Type":"ContainerStarted","Data":"eee44a0e218ebea382fea69a0a5e2013ae902dfc806b61c2ae37478069f67a19"}
Apr 17 07:52:30.080092 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:30.079747 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:52:30.095518 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:30.095466 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-gz9xl" podStartSLOduration=66.256997478 podStartE2EDuration="1m9.095453735s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:52:26.761956319 +0000 UTC m=+65.564907039" lastFinishedPulling="2026-04-17 07:52:29.600412573 +0000 UTC m=+68.403363296" observedRunningTime="2026-04-17 07:52:30.094736771 +0000 UTC m=+68.897687512" watchObservedRunningTime="2026-04-17 07:52:30.095453735 +0000 UTC m=+68.898404526"
Apr 17 07:52:58.373533 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:58.373494 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5"
Apr 17 07:52:58.373968 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:52:58.373552 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q"
Apr 17 07:52:58.373968 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:58.373640 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:58.373968 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:58.373642 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:58.373968 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:58.373699 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls podName:5e3cf222-71f9-4a25-88bb-37c528ac2994 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.373685054 +0000 UTC m=+161.176635772 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls") pod "dns-default-mspd5" (UID: "5e3cf222-71f9-4a25-88bb-37c528ac2994") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:58.373968 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:52:58.373714 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert podName:fcbda289-b762-45ea-ba60-5188e612db63 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.373706942 +0000 UTC m=+161.176657661 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert") pod "ingress-canary-sm78q" (UID: "fcbda289-b762-45ea-ba60-5188e612db63") : secret "canary-serving-cert" not found
Apr 17 07:53:01.083773 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:01.083744 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-gz9xl"
Apr 17 07:53:10.434541 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.434511 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"]
Apr 17 07:53:10.438756 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.438733 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:10.443574 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.443536 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 07:53:10.443574 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.443552 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 07:53:10.443574 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.443547 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fglcm\""
Apr 17 07:53:10.443751 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.443550 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 07:53:10.443751 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.443652 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 07:53:10.446510 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.446476 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"]
Apr 17 07:53:10.448426 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.448407 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6ed2baca-4a17-4906-9829-56274b0374d5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:10.448534 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.448435 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5kk\" (UniqueName: \"kubernetes.io/projected/6ed2baca-4a17-4906-9829-56274b0374d5-kube-api-access-jp5kk\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:10.448534 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.448464 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:10.532887 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.532856 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b"]
Apr 17 07:53:10.535707 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.535660 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b"
Apr 17 07:53:10.536499 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.536474 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt"]
Apr 17 07:53:10.538220 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.538162 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-8d5h6\""
Apr 17 07:53:10.538366 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.538341 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 07:53:10.538480 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.538351 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:53:10.539185 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.539155 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt"
Apr 17 07:53:10.539446 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.539421 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lnwwn"]
Apr 17 07:53:10.541468 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.541433 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 07:53:10.541567 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.541497 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:53:10.541765 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.541744 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 07:53:10.541950 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.541935 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4"]
Apr 17 07:53:10.542077 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.542060 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-mr665\""
Apr 17 07:53:10.542143 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.542066 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn"
Apr 17 07:53:10.542315 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.542280 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 07:53:10.544919 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.544904 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4"
Apr 17 07:53:10.545470 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.545452 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 07:53:10.545543 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.545497 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 07:53:10.545598 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.545584 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-47gcl\""
Apr 17 07:53:10.545953 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.545940 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 07:53:10.546759 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.546743 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:53:10.548178 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.548157 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 07:53:10.548380 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.548360 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 07:53:10.548661 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.548629 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-66v7q\""
Apr 17 07:53:10.549673 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.549649 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm9bn\" (UniqueName: \"kubernetes.io/projected/4fe916f9-75e5-450b-9686-68166482e8a8-kube-api-access-hm9bn\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn"
Apr 17 07:53:10.549765 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.549703 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fe916f9-75e5-450b-9686-68166482e8a8-trusted-ca\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn"
Apr 17 07:53:10.549818 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.549793 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fe916f9-75e5-450b-9686-68166482e8a8-config\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn"
Apr 17 07:53:10.549878 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.549819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9503e60-cd11-4c96-a718-f33e86501791-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fwwvt\" (UID: \"f9503e60-cd11-4c96-a718-f33e86501791\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt"
Apr 17 07:53:10.549878 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.549857 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6b52\" (UniqueName: \"kubernetes.io/projected/f9503e60-cd11-4c96-a718-f33e86501791-kube-api-access-s6b52\") pod \"kube-storage-version-migrator-operator-6769c5d45-fwwvt\" (UID: \"f9503e60-cd11-4c96-a718-f33e86501791\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt"
Apr 17 07:53:10.549988 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.549924 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6ed2baca-4a17-4906-9829-56274b0374d5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:10.549988 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.549953 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp5kk\" (UniqueName: \"kubernetes.io/projected/6ed2baca-4a17-4906-9829-56274b0374d5-kube-api-access-jp5kk\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:10.550085 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.549985 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7c6c\" (UniqueName: \"kubernetes.io/projected/13e57040-8bcb-45c5-9813-b2b4749fd4e4-kube-api-access-h7c6c\") pod \"volume-data-source-validator-7c6cbb6c87-7xz9b\" (UID: \"13e57040-8bcb-45c5-9813-b2b4749fd4e4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b"
Apr 17 07:53:10.550085 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.550023 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9503e60-cd11-4c96-a718-f33e86501791-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fwwvt\" (UID: \"f9503e60-cd11-4c96-a718-f33e86501791\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt"
Apr 17 07:53:10.550179 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.550164 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:10.550572 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.550552 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:53:10.553964 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:10.553946 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 07:53:10.554522 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.554165 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe916f9-75e5-450b-9686-68166482e8a8-serving-cert\") pod \"console-operator-9d4b6777b-lnwwn\" (UID:
\"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:10.555733 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.555706 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b"] Apr 17 07:53:10.555959 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:10.555943 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls podName:6ed2baca-4a17-4906-9829-56274b0374d5 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:11.055907007 +0000 UTC m=+109.858857731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m4cqz" (UID: "6ed2baca-4a17-4906-9829-56274b0374d5") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:10.556061 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.555986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6ed2baca-4a17-4906-9829-56274b0374d5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" Apr 17 07:53:10.556745 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.556579 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt"] Apr 17 07:53:10.557712 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.557690 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lnwwn"] Apr 17 07:53:10.558199 
ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.558182 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 07:53:10.558627 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.558606 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4"] Apr 17 07:53:10.567266 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.567247 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5kk\" (UniqueName: \"kubernetes.io/projected/6ed2baca-4a17-4906-9829-56274b0374d5-kube-api-access-jp5kk\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" Apr 17 07:53:10.655340 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655304 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fe916f9-75e5-450b-9686-68166482e8a8-config\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:10.655512 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9503e60-cd11-4c96-a718-f33e86501791-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fwwvt\" (UID: \"f9503e60-cd11-4c96-a718-f33e86501791\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" Apr 17 07:53:10.655512 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6b52\" (UniqueName: 
\"kubernetes.io/projected/f9503e60-cd11-4c96-a718-f33e86501791-kube-api-access-s6b52\") pod \"kube-storage-version-migrator-operator-6769c5d45-fwwvt\" (UID: \"f9503e60-cd11-4c96-a718-f33e86501791\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" Apr 17 07:53:10.655512 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655434 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7c6c\" (UniqueName: \"kubernetes.io/projected/13e57040-8bcb-45c5-9813-b2b4749fd4e4-kube-api-access-h7c6c\") pod \"volume-data-source-validator-7c6cbb6c87-7xz9b\" (UID: \"13e57040-8bcb-45c5-9813-b2b4749fd4e4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b" Apr 17 07:53:10.655512 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655462 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9503e60-cd11-4c96-a718-f33e86501791-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fwwvt\" (UID: \"f9503e60-cd11-4c96-a718-f33e86501791\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" Apr 17 07:53:10.655776 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe916f9-75e5-450b-9686-68166482e8a8-serving-cert\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:10.655776 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655555 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbkv6\" (UniqueName: 
\"kubernetes.io/projected/60c87110-0aed-4648-8660-2c08620770a1-kube-api-access-xbkv6\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:10.655776 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655585 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:10.655776 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655613 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hm9bn\" (UniqueName: \"kubernetes.io/projected/4fe916f9-75e5-450b-9686-68166482e8a8-kube-api-access-hm9bn\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:10.655776 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.655650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fe916f9-75e5-450b-9686-68166482e8a8-trusted-ca\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:10.656364 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.656331 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fe916f9-75e5-450b-9686-68166482e8a8-config\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:10.656492 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.656470 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9503e60-cd11-4c96-a718-f33e86501791-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fwwvt\" (UID: \"f9503e60-cd11-4c96-a718-f33e86501791\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" Apr 17 07:53:10.656568 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.656547 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fe916f9-75e5-450b-9686-68166482e8a8-trusted-ca\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:10.657973 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.657949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9503e60-cd11-4c96-a718-f33e86501791-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fwwvt\" (UID: \"f9503e60-cd11-4c96-a718-f33e86501791\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" Apr 17 07:53:10.658604 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.658584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe916f9-75e5-450b-9686-68166482e8a8-serving-cert\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:10.664300 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.664254 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-h7c6c\" (UniqueName: \"kubernetes.io/projected/13e57040-8bcb-45c5-9813-b2b4749fd4e4-kube-api-access-h7c6c\") pod \"volume-data-source-validator-7c6cbb6c87-7xz9b\" (UID: \"13e57040-8bcb-45c5-9813-b2b4749fd4e4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b" Apr 17 07:53:10.664599 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.664581 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6b52\" (UniqueName: \"kubernetes.io/projected/f9503e60-cd11-4c96-a718-f33e86501791-kube-api-access-s6b52\") pod \"kube-storage-version-migrator-operator-6769c5d45-fwwvt\" (UID: \"f9503e60-cd11-4c96-a718-f33e86501791\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" Apr 17 07:53:10.664669 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.664652 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm9bn\" (UniqueName: \"kubernetes.io/projected/4fe916f9-75e5-450b-9686-68166482e8a8-kube-api-access-hm9bn\") pod \"console-operator-9d4b6777b-lnwwn\" (UID: \"4fe916f9-75e5-450b-9686-68166482e8a8\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:10.756787 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.756753 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbkv6\" (UniqueName: \"kubernetes.io/projected/60c87110-0aed-4648-8660-2c08620770a1-kube-api-access-xbkv6\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:10.756965 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.756805 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:10.756965 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:10.756945 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:53:10.757074 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:10.757027 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls podName:60c87110-0aed-4648-8660-2c08620770a1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:11.257006148 +0000 UTC m=+110.059956882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qhzj4" (UID: "60c87110-0aed-4648-8660-2c08620770a1") : secret "samples-operator-tls" not found Apr 17 07:53:10.765828 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.765793 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbkv6\" (UniqueName: \"kubernetes.io/projected/60c87110-0aed-4648-8660-2c08620770a1-kube-api-access-xbkv6\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:10.847066 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.847024 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b" Apr 17 07:53:10.859967 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.859941 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" Apr 17 07:53:10.865609 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:10.865587 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:11.011601 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:11.011576 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b"] Apr 17 07:53:11.013700 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:11.013672 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13e57040_8bcb_45c5_9813_b2b4749fd4e4.slice/crio-970f9f2b7693a5de3833b79d7600d6da1e0813370c99a20545847dc6ecb0fec4 WatchSource:0}: Error finding container 970f9f2b7693a5de3833b79d7600d6da1e0813370c99a20545847dc6ecb0fec4: Status 404 returned error can't find the container with id 970f9f2b7693a5de3833b79d7600d6da1e0813370c99a20545847dc6ecb0fec4 Apr 17 07:53:11.059405 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:11.059367 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" Apr 17 07:53:11.059548 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:11.059476 2567 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:11.059548 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:11.059533 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls podName:6ed2baca-4a17-4906-9829-56274b0374d5 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:12.059518538 +0000 UTC m=+110.862469258 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m4cqz" (UID: "6ed2baca-4a17-4906-9829-56274b0374d5") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:11.159677 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:11.159639 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b" event={"ID":"13e57040-8bcb-45c5-9813-b2b4749fd4e4","Type":"ContainerStarted","Data":"970f9f2b7693a5de3833b79d7600d6da1e0813370c99a20545847dc6ecb0fec4"} Apr 17 07:53:11.233249 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:11.233218 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lnwwn"] Apr 17 07:53:11.236163 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:11.236129 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fe916f9_75e5_450b_9686_68166482e8a8.slice/crio-cd08ab199af2c70e44b26c95e018e18245f1adc02b352380ba7157f017a28776 WatchSource:0}: Error finding container cd08ab199af2c70e44b26c95e018e18245f1adc02b352380ba7157f017a28776: Status 404 returned error can't find the container with id cd08ab199af2c70e44b26c95e018e18245f1adc02b352380ba7157f017a28776 Apr 17 
07:53:11.236545 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:11.236519 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt"] Apr 17 07:53:11.241059 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:11.241029 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9503e60_cd11_4c96_a718_f33e86501791.slice/crio-28291d594ccd627bac4529561661acf7d5b6e71eea1548705698e5a948bcc88c WatchSource:0}: Error finding container 28291d594ccd627bac4529561661acf7d5b6e71eea1548705698e5a948bcc88c: Status 404 returned error can't find the container with id 28291d594ccd627bac4529561661acf7d5b6e71eea1548705698e5a948bcc88c Apr 17 07:53:11.260350 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:11.260269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:11.260463 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:11.260418 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:53:11.260510 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:11.260477 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls podName:60c87110-0aed-4648-8660-2c08620770a1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:12.260461971 +0000 UTC m=+111.063412691 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qhzj4" (UID: "60c87110-0aed-4648-8660-2c08620770a1") : secret "samples-operator-tls" not found Apr 17 07:53:12.066807 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:12.066759 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" Apr 17 07:53:12.067230 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:12.067001 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:12.067230 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:12.067072 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls podName:6ed2baca-4a17-4906-9829-56274b0374d5 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:14.067051151 +0000 UTC m=+112.870001875 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m4cqz" (UID: "6ed2baca-4a17-4906-9829-56274b0374d5") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:12.163971 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:12.163908 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" event={"ID":"4fe916f9-75e5-450b-9686-68166482e8a8","Type":"ContainerStarted","Data":"cd08ab199af2c70e44b26c95e018e18245f1adc02b352380ba7157f017a28776"} Apr 17 07:53:12.165206 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:12.165174 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" event={"ID":"f9503e60-cd11-4c96-a718-f33e86501791","Type":"ContainerStarted","Data":"28291d594ccd627bac4529561661acf7d5b6e71eea1548705698e5a948bcc88c"} Apr 17 07:53:12.268921 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:12.268885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:12.269113 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:12.269061 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:53:12.269179 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:12.269144 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls 
podName:60c87110-0aed-4648-8660-2c08620770a1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:14.269122141 +0000 UTC m=+113.072072866 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qhzj4" (UID: "60c87110-0aed-4648-8660-2c08620770a1") : secret "samples-operator-tls" not found Apr 17 07:53:13.168468 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:13.168432 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b" event={"ID":"13e57040-8bcb-45c5-9813-b2b4749fd4e4","Type":"ContainerStarted","Data":"e29feb696102280836378fca019fdc32013a4fa163ef07c9e0a59e2a1b08e2b3"} Apr 17 07:53:13.184241 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:13.184188 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7xz9b" podStartSLOduration=1.8451138139999999 podStartE2EDuration="3.184171022s" podCreationTimestamp="2026-04-17 07:53:10 +0000 UTC" firstStartedPulling="2026-04-17 07:53:11.015537686 +0000 UTC m=+109.818488405" lastFinishedPulling="2026-04-17 07:53:12.354594884 +0000 UTC m=+111.157545613" observedRunningTime="2026-04-17 07:53:13.183497754 +0000 UTC m=+111.986448498" watchObservedRunningTime="2026-04-17 07:53:13.184171022 +0000 UTC m=+111.987121763" Apr 17 07:53:14.083476 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.083448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" Apr 17 07:53:14.083642 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:14.083610 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:14.083712 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:14.083700 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls podName:6ed2baca-4a17-4906-9829-56274b0374d5 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:18.083680318 +0000 UTC m=+116.886631050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m4cqz" (UID: "6ed2baca-4a17-4906-9829-56274b0374d5") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:14.171712 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.171691 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/0.log" Apr 17 07:53:14.172063 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.171729 2567 generic.go:358] "Generic (PLEG): container finished" podID="4fe916f9-75e5-450b-9686-68166482e8a8" containerID="64c426afc84238fc3f08191a1757ad05874c3672d96031e314cf43554dfc64cb" exitCode=255 Apr 17 07:53:14.172063 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.171798 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" event={"ID":"4fe916f9-75e5-450b-9686-68166482e8a8","Type":"ContainerDied","Data":"64c426afc84238fc3f08191a1757ad05874c3672d96031e314cf43554dfc64cb"} Apr 17 07:53:14.172063 ip-10-0-130-28 kubenswrapper[2567]: I0417 
07:53:14.172033 2567 scope.go:117] "RemoveContainer" containerID="64c426afc84238fc3f08191a1757ad05874c3672d96031e314cf43554dfc64cb" Apr 17 07:53:14.173321 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.173268 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" event={"ID":"f9503e60-cd11-4c96-a718-f33e86501791","Type":"ContainerStarted","Data":"54e754585b57eb56dcedbf16307fb400718a255e9bc1ee24160b115e2efb329c"} Apr 17 07:53:14.200815 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.200774 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" podStartSLOduration=1.40134042 podStartE2EDuration="4.200758181s" podCreationTimestamp="2026-04-17 07:53:10 +0000 UTC" firstStartedPulling="2026-04-17 07:53:11.242839901 +0000 UTC m=+110.045790623" lastFinishedPulling="2026-04-17 07:53:14.042257662 +0000 UTC m=+112.845208384" observedRunningTime="2026-04-17 07:53:14.200215822 +0000 UTC m=+113.003166564" watchObservedRunningTime="2026-04-17 07:53:14.200758181 +0000 UTC m=+113.003708921" Apr 17 07:53:14.284396 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.284364 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:14.284606 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:14.284534 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:53:14.284670 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:14.284610 
2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls podName:60c87110-0aed-4648-8660-2c08620770a1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:18.284595682 +0000 UTC m=+117.087546404 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qhzj4" (UID: "60c87110-0aed-4648-8660-2c08620770a1") : secret "samples-operator-tls" not found Apr 17 07:53:14.513247 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.513215 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92"] Apr 17 07:53:14.519614 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.519583 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92" Apr 17 07:53:14.522169 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.522143 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-bczfs\"" Apr 17 07:53:14.523187 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.523166 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92"] Apr 17 07:53:14.587056 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.586954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvf7p\" (UniqueName: \"kubernetes.io/projected/ecac921a-12f0-4bcc-ac34-e10db9b1ae9a-kube-api-access-cvf7p\") pod \"network-check-source-8894fc9bd-gnk92\" (UID: \"ecac921a-12f0-4bcc-ac34-e10db9b1ae9a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92" Apr 17 07:53:14.687923 
ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.687885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvf7p\" (UniqueName: \"kubernetes.io/projected/ecac921a-12f0-4bcc-ac34-e10db9b1ae9a-kube-api-access-cvf7p\") pod \"network-check-source-8894fc9bd-gnk92\" (UID: \"ecac921a-12f0-4bcc-ac34-e10db9b1ae9a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92" Apr 17 07:53:14.695982 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.695954 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvf7p\" (UniqueName: \"kubernetes.io/projected/ecac921a-12f0-4bcc-ac34-e10db9b1ae9a-kube-api-access-cvf7p\") pod \"network-check-source-8894fc9bd-gnk92\" (UID: \"ecac921a-12f0-4bcc-ac34-e10db9b1ae9a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92" Apr 17 07:53:14.829618 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.829559 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92" Apr 17 07:53:14.943488 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:14.943459 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92"] Apr 17 07:53:14.946265 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:14.946235 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecac921a_12f0_4bcc_ac34_e10db9b1ae9a.slice/crio-dabef4b4d0bcde917bff3d7382de80d60270b1a8abe848c87e49751ad7df4cbd WatchSource:0}: Error finding container dabef4b4d0bcde917bff3d7382de80d60270b1a8abe848c87e49751ad7df4cbd: Status 404 returned error can't find the container with id dabef4b4d0bcde917bff3d7382de80d60270b1a8abe848c87e49751ad7df4cbd Apr 17 07:53:15.177128 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:15.177051 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 07:53:15.177567 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:15.177456 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/0.log" Apr 17 07:53:15.177567 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:15.177489 2567 generic.go:358] "Generic (PLEG): container finished" podID="4fe916f9-75e5-450b-9686-68166482e8a8" containerID="536ea78cc80e898a65c00ded88d82ebf219f16d88a024180bb8239c7fb889ab8" exitCode=255 Apr 17 07:53:15.177567 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:15.177525 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" 
event={"ID":"4fe916f9-75e5-450b-9686-68166482e8a8","Type":"ContainerDied","Data":"536ea78cc80e898a65c00ded88d82ebf219f16d88a024180bb8239c7fb889ab8"} Apr 17 07:53:15.177730 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:15.177574 2567 scope.go:117] "RemoveContainer" containerID="64c426afc84238fc3f08191a1757ad05874c3672d96031e314cf43554dfc64cb" Apr 17 07:53:15.177897 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:15.177873 2567 scope.go:117] "RemoveContainer" containerID="536ea78cc80e898a65c00ded88d82ebf219f16d88a024180bb8239c7fb889ab8" Apr 17 07:53:15.178110 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:15.178091 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lnwwn_openshift-console-operator(4fe916f9-75e5-450b-9686-68166482e8a8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" podUID="4fe916f9-75e5-450b-9686-68166482e8a8" Apr 17 07:53:15.179021 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:15.178986 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92" event={"ID":"ecac921a-12f0-4bcc-ac34-e10db9b1ae9a","Type":"ContainerStarted","Data":"338b2bde8da7e15d142f4341bd471fc8efbe8dfa97c128cc3862582a3345f8d1"} Apr 17 07:53:15.179213 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:15.179030 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92" event={"ID":"ecac921a-12f0-4bcc-ac34-e10db9b1ae9a","Type":"ContainerStarted","Data":"dabef4b4d0bcde917bff3d7382de80d60270b1a8abe848c87e49751ad7df4cbd"} Apr 17 07:53:15.208625 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:15.208576 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gnk92" 
podStartSLOduration=1.20855937 podStartE2EDuration="1.20855937s" podCreationTimestamp="2026-04-17 07:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:53:15.207864332 +0000 UTC m=+114.010815075" watchObservedRunningTime="2026-04-17 07:53:15.20855937 +0000 UTC m=+114.011510112" Apr 17 07:53:16.183232 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:16.183198 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 07:53:16.183755 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:16.183689 2567 scope.go:117] "RemoveContainer" containerID="536ea78cc80e898a65c00ded88d82ebf219f16d88a024180bb8239c7fb889ab8" Apr 17 07:53:16.183891 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:16.183872 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lnwwn_openshift-console-operator(4fe916f9-75e5-450b-9686-68166482e8a8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" podUID="4fe916f9-75e5-450b-9686-68166482e8a8" Apr 17 07:53:16.261377 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:16.261348 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4dz7x_bb277a7c-b922-4c78-a4fd-5882a862b97a/dns-node-resolver/0.log" Apr 17 07:53:17.461319 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.461272 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ks9m9_7254920c-50ea-4fc4-b393-00fa4b69ad5b/node-ca/0.log" Apr 17 07:53:17.595582 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.595541 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca/service-ca-865cb79987-pddds"] Apr 17 07:53:17.599662 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.599638 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.602013 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.601979 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 07:53:17.602013 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.602005 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 07:53:17.602198 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.602090 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-sr4cw\"" Apr 17 07:53:17.602198 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.602147 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 07:53:17.603046 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.603031 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 07:53:17.607607 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.607589 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pddds"] Apr 17 07:53:17.713505 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.713413 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/24bd488c-a178-4a74-96e9-1d6e269355e5-signing-key\") pod \"service-ca-865cb79987-pddds\" (UID: \"24bd488c-a178-4a74-96e9-1d6e269355e5\") " pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.713505 ip-10-0-130-28 kubenswrapper[2567]: 
I0417 07:53:17.713467 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/24bd488c-a178-4a74-96e9-1d6e269355e5-signing-cabundle\") pod \"service-ca-865cb79987-pddds\" (UID: \"24bd488c-a178-4a74-96e9-1d6e269355e5\") " pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.713505 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.713492 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n54lv\" (UniqueName: \"kubernetes.io/projected/24bd488c-a178-4a74-96e9-1d6e269355e5-kube-api-access-n54lv\") pod \"service-ca-865cb79987-pddds\" (UID: \"24bd488c-a178-4a74-96e9-1d6e269355e5\") " pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.814833 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.814796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/24bd488c-a178-4a74-96e9-1d6e269355e5-signing-cabundle\") pod \"service-ca-865cb79987-pddds\" (UID: \"24bd488c-a178-4a74-96e9-1d6e269355e5\") " pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.814833 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.814840 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n54lv\" (UniqueName: \"kubernetes.io/projected/24bd488c-a178-4a74-96e9-1d6e269355e5-kube-api-access-n54lv\") pod \"service-ca-865cb79987-pddds\" (UID: \"24bd488c-a178-4a74-96e9-1d6e269355e5\") " pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.815066 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.814960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/24bd488c-a178-4a74-96e9-1d6e269355e5-signing-key\") pod \"service-ca-865cb79987-pddds\" (UID: 
\"24bd488c-a178-4a74-96e9-1d6e269355e5\") " pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.815594 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.815568 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/24bd488c-a178-4a74-96e9-1d6e269355e5-signing-cabundle\") pod \"service-ca-865cb79987-pddds\" (UID: \"24bd488c-a178-4a74-96e9-1d6e269355e5\") " pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.817532 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.817505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/24bd488c-a178-4a74-96e9-1d6e269355e5-signing-key\") pod \"service-ca-865cb79987-pddds\" (UID: \"24bd488c-a178-4a74-96e9-1d6e269355e5\") " pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.823163 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.823141 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n54lv\" (UniqueName: \"kubernetes.io/projected/24bd488c-a178-4a74-96e9-1d6e269355e5-kube-api-access-n54lv\") pod \"service-ca-865cb79987-pddds\" (UID: \"24bd488c-a178-4a74-96e9-1d6e269355e5\") " pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:17.909818 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:17.909787 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pddds" Apr 17 07:53:18.034620 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:18.034588 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pddds"] Apr 17 07:53:18.037582 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:18.037553 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24bd488c_a178_4a74_96e9_1d6e269355e5.slice/crio-6cec4dd37e0e8fde41248cbc13a934cbc6df71c8b07c27d11965d0d780646dca WatchSource:0}: Error finding container 6cec4dd37e0e8fde41248cbc13a934cbc6df71c8b07c27d11965d0d780646dca: Status 404 returned error can't find the container with id 6cec4dd37e0e8fde41248cbc13a934cbc6df71c8b07c27d11965d0d780646dca Apr 17 07:53:18.117364 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:18.117329 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" Apr 17 07:53:18.117535 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:18.117496 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:18.117577 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:18.117567 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls podName:6ed2baca-4a17-4906-9829-56274b0374d5 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:26.117551825 +0000 UTC m=+124.920502548 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m4cqz" (UID: "6ed2baca-4a17-4906-9829-56274b0374d5") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:18.189783 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:18.189740 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pddds" event={"ID":"24bd488c-a178-4a74-96e9-1d6e269355e5","Type":"ContainerStarted","Data":"6cec4dd37e0e8fde41248cbc13a934cbc6df71c8b07c27d11965d0d780646dca"} Apr 17 07:53:18.319000 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:18.318961 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:18.319181 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:18.319118 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:53:18.319228 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:18.319194 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls podName:60c87110-0aed-4648-8660-2c08620770a1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:26.319177594 +0000 UTC m=+125.122128319 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qhzj4" (UID: "60c87110-0aed-4648-8660-2c08620770a1") : secret "samples-operator-tls" not found Apr 17 07:53:20.196281 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:20.196242 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pddds" event={"ID":"24bd488c-a178-4a74-96e9-1d6e269355e5","Type":"ContainerStarted","Data":"0fa57152717e00af20e1cd9fa5a6506dd33054fd90bc1d3be6b263b47789496f"} Apr 17 07:53:20.212215 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:20.212125 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-pddds" podStartSLOduration=1.540378853 podStartE2EDuration="3.212106343s" podCreationTimestamp="2026-04-17 07:53:17 +0000 UTC" firstStartedPulling="2026-04-17 07:53:18.039441154 +0000 UTC m=+116.842391873" lastFinishedPulling="2026-04-17 07:53:19.711168641 +0000 UTC m=+118.514119363" observedRunningTime="2026-04-17 07:53:20.211622489 +0000 UTC m=+119.014573230" watchObservedRunningTime="2026-04-17 07:53:20.212106343 +0000 UTC m=+119.015057088" Apr 17 07:53:20.866261 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:20.866229 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:20.866261 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:20.866268 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" Apr 17 07:53:20.866727 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:20.866713 2567 scope.go:117] "RemoveContainer" containerID="536ea78cc80e898a65c00ded88d82ebf219f16d88a024180bb8239c7fb889ab8" Apr 17 07:53:20.866981 ip-10-0-130-28 kubenswrapper[2567]: 
E0417 07:53:20.866961 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lnwwn_openshift-console-operator(4fe916f9-75e5-450b-9686-68166482e8a8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" podUID="4fe916f9-75e5-450b-9686-68166482e8a8" Apr 17 07:53:26.185829 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:26.185789 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" Apr 17 07:53:26.186363 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:26.185927 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:26.186363 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:26.185983 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls podName:6ed2baca-4a17-4906-9829-56274b0374d5 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:42.185969293 +0000 UTC m=+140.988920012 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m4cqz" (UID: "6ed2baca-4a17-4906-9829-56274b0374d5") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:26.388623 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:26.388564 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:26.391892 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:26.391868 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60c87110-0aed-4648-8660-2c08620770a1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qhzj4\" (UID: \"60c87110-0aed-4648-8660-2c08620770a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:26.477627 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:26.477543 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-66v7q\"" Apr 17 07:53:26.485639 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:26.485607 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" Apr 17 07:53:26.606277 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:26.606245 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4"] Apr 17 07:53:27.216194 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:27.216151 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" event={"ID":"60c87110-0aed-4648-8660-2c08620770a1","Type":"ContainerStarted","Data":"7ee8368dadd2241ffb0df46dda9680ad736303d06161b7606319165a01b73d41"} Apr 17 07:53:29.223255 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:29.223223 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" event={"ID":"60c87110-0aed-4648-8660-2c08620770a1","Type":"ContainerStarted","Data":"ccfafab05c9e9cb9a78c91f7a4a44b0762c039f512fbcd8d90f95ffc658f04ee"} Apr 17 07:53:29.223255 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:29.223257 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" event={"ID":"60c87110-0aed-4648-8660-2c08620770a1","Type":"ContainerStarted","Data":"5e65743884809f9390d0dc87858c56f19fa51f37b6d61aafdc07e79b954fef51"} Apr 17 07:53:29.238014 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:29.237963 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qhzj4" podStartSLOduration=17.488436707 podStartE2EDuration="19.237948014s" podCreationTimestamp="2026-04-17 07:53:10 +0000 UTC" firstStartedPulling="2026-04-17 07:53:26.647693701 +0000 UTC m=+125.450644421" lastFinishedPulling="2026-04-17 07:53:28.397205007 +0000 UTC m=+127.200155728" observedRunningTime="2026-04-17 
07:53:29.237339401 +0000 UTC m=+128.040290141" watchObservedRunningTime="2026-04-17 07:53:29.237948014 +0000 UTC m=+128.040898754" Apr 17 07:53:30.521575 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:30.521539 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:53:30.523988 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:30.523964 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e593a1-ee06-4a3a-9bef-3d1c3097b01d-metrics-certs\") pod \"network-metrics-daemon-nnkhx\" (UID: \"86e593a1-ee06-4a3a-9bef-3d1c3097b01d\") " pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:53:30.594579 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:30.594547 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjzzs\"" Apr 17 07:53:30.602595 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:30.602577 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nnkhx" Apr 17 07:53:30.723764 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:30.723734 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nnkhx"] Apr 17 07:53:30.727594 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:30.727558 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e593a1_ee06_4a3a_9bef_3d1c3097b01d.slice/crio-f19031ebae6db8f1c88bfbfa5acea404aebf13f66ef844646070668df82e7c91 WatchSource:0}: Error finding container f19031ebae6db8f1c88bfbfa5acea404aebf13f66ef844646070668df82e7c91: Status 404 returned error can't find the container with id f19031ebae6db8f1c88bfbfa5acea404aebf13f66ef844646070668df82e7c91 Apr 17 07:53:31.230179 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:31.230142 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nnkhx" event={"ID":"86e593a1-ee06-4a3a-9bef-3d1c3097b01d","Type":"ContainerStarted","Data":"f19031ebae6db8f1c88bfbfa5acea404aebf13f66ef844646070668df82e7c91"} Apr 17 07:53:32.778626 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:32.778599 2567 scope.go:117] "RemoveContainer" containerID="536ea78cc80e898a65c00ded88d82ebf219f16d88a024180bb8239c7fb889ab8" Apr 17 07:53:33.237931 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:33.237837 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 07:53:33.238114 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:33.237975 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" event={"ID":"4fe916f9-75e5-450b-9686-68166482e8a8","Type":"ContainerStarted","Data":"e2299386831f613079c653537098ebe332305cbc20bd85ad8550e078eda01a22"} Apr 17 
07:53:33.238913 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:33.238878 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn"
Apr 17 07:53:33.240702 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:33.240678 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nnkhx" event={"ID":"86e593a1-ee06-4a3a-9bef-3d1c3097b01d","Type":"ContainerStarted","Data":"be9c77ae938f536fe6ede0ff4e3ba41fe53230da217c00c8951f69338f63d401"}
Apr 17 07:53:33.240787 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:33.240725 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nnkhx" event={"ID":"86e593a1-ee06-4a3a-9bef-3d1c3097b01d","Type":"ContainerStarted","Data":"1142a6d6406542e69a97cddd081618d9901ce158c0b1b181f52cd3eafe9a0054"}
Apr 17 07:53:33.245178 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:33.245152 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn"
Apr 17 07:53:33.256012 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:33.255968 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-lnwwn" podStartSLOduration=20.454704992 podStartE2EDuration="23.255955801s" podCreationTimestamp="2026-04-17 07:53:10 +0000 UTC" firstStartedPulling="2026-04-17 07:53:11.238232918 +0000 UTC m=+110.041183637" lastFinishedPulling="2026-04-17 07:53:14.039483724 +0000 UTC m=+112.842434446" observedRunningTime="2026-04-17 07:53:33.254948408 +0000 UTC m=+132.057899175" watchObservedRunningTime="2026-04-17 07:53:33.255955801 +0000 UTC m=+132.058906570"
Apr 17 07:53:33.287925 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:33.287878 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nnkhx" podStartSLOduration=130.701407847 podStartE2EDuration="2m12.287864288s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:53:30.729489779 +0000 UTC m=+129.532440498" lastFinishedPulling="2026-04-17 07:53:32.31594622 +0000 UTC m=+131.118896939" observedRunningTime="2026-04-17 07:53:33.287094179 +0000 UTC m=+132.090044914" watchObservedRunningTime="2026-04-17 07:53:33.287864288 +0000 UTC m=+132.090815022"
Apr 17 07:53:41.097110 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.097074 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-22l88"]
Apr 17 07:53:41.100203 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.100187 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88"
Apr 17 07:53:41.102667 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.102642 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 07:53:41.103540 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.103514 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 07:53:41.103540 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.103532 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hhtmj\""
Apr 17 07:53:41.108595 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.108574 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-22l88"]
Apr 17 07:53:41.205596 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.205554 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0aec996c-f5f7-4af3-8685-2febd74582db-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-22l88\" (UID: \"0aec996c-f5f7-4af3-8685-2febd74582db\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88"
Apr 17 07:53:41.205782 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.205613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0aec996c-f5f7-4af3-8685-2febd74582db-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-22l88\" (UID: \"0aec996c-f5f7-4af3-8685-2febd74582db\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88"
Apr 17 07:53:41.212729 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.212701 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-tw9dk"]
Apr 17 07:53:41.219342 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.219319 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tw9dk"
Apr 17 07:53:41.228702 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.228681 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 07:53:41.229725 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.229707 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 07:53:41.234063 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.234047 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-jgsmk\""
Apr 17 07:53:41.255428 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.255399 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tw9dk"]
Apr 17 07:53:41.256455 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.256432 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-f6tcx"]
Apr 17 07:53:41.259448 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.259430 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.262184 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.262169 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 07:53:41.262184 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.262176 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 07:53:41.267864 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.267846 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 07:53:41.267864 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.267861 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 07:53:41.277518 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.277501 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6r99z\""
Apr 17 07:53:41.280045 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.280026 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f6tcx"]
Apr 17 07:53:41.306829 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.306793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0aec996c-f5f7-4af3-8685-2febd74582db-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-22l88\" (UID: \"0aec996c-f5f7-4af3-8685-2febd74582db\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88"
Apr 17 07:53:41.306953 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.306875 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0aec996c-f5f7-4af3-8685-2febd74582db-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-22l88\" (UID: \"0aec996c-f5f7-4af3-8685-2febd74582db\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88"
Apr 17 07:53:41.306953 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.306943 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pnj\" (UniqueName: \"kubernetes.io/projected/7d7c2a20-730c-49bd-ac08-496b110637bd-kube-api-access-67pnj\") pod \"downloads-6bcc868b7-tw9dk\" (UID: \"7d7c2a20-730c-49bd-ac08-496b110637bd\") " pod="openshift-console/downloads-6bcc868b7-tw9dk"
Apr 17 07:53:41.307696 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.307595 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0aec996c-f5f7-4af3-8685-2febd74582db-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-22l88\" (UID: \"0aec996c-f5f7-4af3-8685-2febd74582db\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88"
Apr 17 07:53:41.309363 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.309346 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0aec996c-f5f7-4af3-8685-2febd74582db-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-22l88\" (UID: \"0aec996c-f5f7-4af3-8685-2febd74582db\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88"
Apr 17 07:53:41.408096 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.408010 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-data-volume\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.408096 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.408069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn7rw\" (UniqueName: \"kubernetes.io/projected/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-kube-api-access-hn7rw\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.408273 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.408132 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.408273 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.408184 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.408273 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.408247 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-crio-socket\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.408414 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.408278 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67pnj\" (UniqueName: \"kubernetes.io/projected/7d7c2a20-730c-49bd-ac08-496b110637bd-kube-api-access-67pnj\") pod \"downloads-6bcc868b7-tw9dk\" (UID: \"7d7c2a20-730c-49bd-ac08-496b110637bd\") " pod="openshift-console/downloads-6bcc868b7-tw9dk"
Apr 17 07:53:41.409994 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.409977 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88"
Apr 17 07:53:41.420456 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.420431 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pnj\" (UniqueName: \"kubernetes.io/projected/7d7c2a20-730c-49bd-ac08-496b110637bd-kube-api-access-67pnj\") pod \"downloads-6bcc868b7-tw9dk\" (UID: \"7d7c2a20-730c-49bd-ac08-496b110637bd\") " pod="openshift-console/downloads-6bcc868b7-tw9dk"
Apr 17 07:53:41.509366 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.509338 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-crio-socket\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.509520 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.509395 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-data-volume\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.509520 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.509485 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-crio-socket\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.509637 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.509590 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn7rw\" (UniqueName: \"kubernetes.io/projected/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-kube-api-access-hn7rw\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.509637 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.509627 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.509735 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.509664 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.509735 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.509720 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-data-volume\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.510129 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.510108 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.512154 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.512132 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.521130 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.521099 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn7rw\" (UniqueName: \"kubernetes.io/projected/fab94ad2-1267-48c8-9ec7-3160b92c3f4c-kube-api-access-hn7rw\") pod \"insights-runtime-extractor-f6tcx\" (UID: \"fab94ad2-1267-48c8-9ec7-3160b92c3f4c\") " pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.528001 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.527979 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tw9dk"
Apr 17 07:53:41.536627 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.536607 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-22l88"]
Apr 17 07:53:41.539692 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:41.539665 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aec996c_f5f7_4af3_8685_2febd74582db.slice/crio-390a8f9a53c71d25c084f86329647e6712848f39b20c6496983453b87bb2bcb9 WatchSource:0}: Error finding container 390a8f9a53c71d25c084f86329647e6712848f39b20c6496983453b87bb2bcb9: Status 404 returned error can't find the container with id 390a8f9a53c71d25c084f86329647e6712848f39b20c6496983453b87bb2bcb9
Apr 17 07:53:41.568259 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.568114 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f6tcx"
Apr 17 07:53:41.664668 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.664584 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tw9dk"]
Apr 17 07:53:41.668157 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:41.668129 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7c2a20_730c_49bd_ac08_496b110637bd.slice/crio-abf4b225a7a0864ba124878c5ba78276a51025fa457959349c7d013aeec664da WatchSource:0}: Error finding container abf4b225a7a0864ba124878c5ba78276a51025fa457959349c7d013aeec664da: Status 404 returned error can't find the container with id abf4b225a7a0864ba124878c5ba78276a51025fa457959349c7d013aeec664da
Apr 17 07:53:41.699981 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:41.699955 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f6tcx"]
Apr 17 07:53:41.702916 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:41.702888 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab94ad2_1267_48c8_9ec7_3160b92c3f4c.slice/crio-924e76dc37fea56fc7905a9c873281a4a87a274c4128915a1ffb663ba78d7901 WatchSource:0}: Error finding container 924e76dc37fea56fc7905a9c873281a4a87a274c4128915a1ffb663ba78d7901: Status 404 returned error can't find the container with id 924e76dc37fea56fc7905a9c873281a4a87a274c4128915a1ffb663ba78d7901
Apr 17 07:53:42.215820 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:42.215780 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:42.222567 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:42.222530 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed2baca-4a17-4906-9829-56274b0374d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m4cqz\" (UID: \"6ed2baca-4a17-4906-9829-56274b0374d5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:42.249946 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:42.249917 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fglcm\""
Apr 17 07:53:42.258442 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:42.258173 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"
Apr 17 07:53:42.264953 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:42.264922 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88" event={"ID":"0aec996c-f5f7-4af3-8685-2febd74582db","Type":"ContainerStarted","Data":"390a8f9a53c71d25c084f86329647e6712848f39b20c6496983453b87bb2bcb9"}
Apr 17 07:53:42.266799 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:42.266772 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6tcx" event={"ID":"fab94ad2-1267-48c8-9ec7-3160b92c3f4c","Type":"ContainerStarted","Data":"1908d2eed579a9470bf4955b6dda3ca4984ca90a59fe79186ede9fc3edd4c31b"}
Apr 17 07:53:42.266921 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:42.266805 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6tcx" event={"ID":"fab94ad2-1267-48c8-9ec7-3160b92c3f4c","Type":"ContainerStarted","Data":"924e76dc37fea56fc7905a9c873281a4a87a274c4128915a1ffb663ba78d7901"}
Apr 17 07:53:42.267947 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:42.267916 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tw9dk" event={"ID":"7d7c2a20-730c-49bd-ac08-496b110637bd","Type":"ContainerStarted","Data":"abf4b225a7a0864ba124878c5ba78276a51025fa457959349c7d013aeec664da"}
Apr 17 07:53:42.754366 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:42.754172 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz"]
Apr 17 07:53:42.756225 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:42.756197 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ed2baca_4a17_4906_9829_56274b0374d5.slice/crio-9f8190a2ccd7309a4eb6ba23db59ca2f078e90da12df031140ea011ce9e80396 WatchSource:0}: Error finding container 9f8190a2ccd7309a4eb6ba23db59ca2f078e90da12df031140ea011ce9e80396: Status 404 returned error can't find the container with id 9f8190a2ccd7309a4eb6ba23db59ca2f078e90da12df031140ea011ce9e80396
Apr 17 07:53:43.273317 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.273082 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88" event={"ID":"0aec996c-f5f7-4af3-8685-2febd74582db","Type":"ContainerStarted","Data":"dcc8ba2d02ecb8037076c69bffa21139703724756bab8f6e2ba47827c27252cc"}
Apr 17 07:53:43.274465 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.274405 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" event={"ID":"6ed2baca-4a17-4906-9829-56274b0374d5","Type":"ContainerStarted","Data":"9f8190a2ccd7309a4eb6ba23db59ca2f078e90da12df031140ea011ce9e80396"}
Apr 17 07:53:43.276701 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.276673 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6tcx" event={"ID":"fab94ad2-1267-48c8-9ec7-3160b92c3f4c","Type":"ContainerStarted","Data":"043cb3c49015b7b1075224140475fddfecc982d110a9c1bcb97c4c3c6acdaf09"}
Apr 17 07:53:43.289023 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.288964 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-22l88" podStartSLOduration=1.195249247 podStartE2EDuration="2.288946079s" podCreationTimestamp="2026-04-17 07:53:41 +0000 UTC" firstStartedPulling="2026-04-17 07:53:41.542166051 +0000 UTC m=+140.345116769" lastFinishedPulling="2026-04-17 07:53:42.635862868 +0000 UTC m=+141.438813601" observedRunningTime="2026-04-17 07:53:43.287004082 +0000 UTC m=+142.089954826" watchObservedRunningTime="2026-04-17 07:53:43.288946079 +0000 UTC m=+142.091896822"
Apr 17 07:53:43.923845 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.923811 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b8db6fcbc-qdtzr"]
Apr 17 07:53:43.928559 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.928530 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:43.931108 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.930905 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 07:53:43.932312 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.932031 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 07:53:43.932312 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.932118 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gf4pd\""
Apr 17 07:53:43.932312 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.932274 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 07:53:43.932535 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.932379 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 07:53:43.932535 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.932492 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 07:53:43.943184 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:43.943137 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b8db6fcbc-qdtzr"]
Apr 17 07:53:44.033397 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.033327 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-oauth-config\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.033571 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.033453 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-config\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.033571 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.033486 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-service-ca\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.033571 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.033532 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmwfk\" (UniqueName: \"kubernetes.io/projected/4cb4407b-fac2-416d-abdd-9452c01b5c16-kube-api-access-qmwfk\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.033741 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.033590 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-oauth-serving-cert\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.033741 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.033625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-serving-cert\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.134894 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.134862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-serving-cert\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.135068 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.134944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-oauth-config\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.135068 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.134983 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-config\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.135068 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.135006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-service-ca\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.135068 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.135040 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmwfk\" (UniqueName: \"kubernetes.io/projected/4cb4407b-fac2-416d-abdd-9452c01b5c16-kube-api-access-qmwfk\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.135280 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.135081 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-oauth-serving-cert\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.135840 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.135813 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-oauth-serving-cert\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.135942 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.135895 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-service-ca\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.136752 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.136722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-config\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.138516 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.138493 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-oauth-config\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.139337 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.139314 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-serving-cert\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.143663 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.143638 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmwfk\" (UniqueName: \"kubernetes.io/projected/4cb4407b-fac2-416d-abdd-9452c01b5c16-kube-api-access-qmwfk\") pod \"console-7b8db6fcbc-qdtzr\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") " pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.246897 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.246868 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:53:44.283886 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.283469 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6tcx" event={"ID":"fab94ad2-1267-48c8-9ec7-3160b92c3f4c","Type":"ContainerStarted","Data":"4df5d9edbdf90f9772deff324b1bb54964b7a50efd97195811ee1948e000bdd3"}
Apr 17 07:53:44.302274 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.302223 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-f6tcx" podStartSLOduration=0.923733527 podStartE2EDuration="3.302204176s" podCreationTimestamp="2026-04-17 07:53:41 +0000 UTC" firstStartedPulling="2026-04-17 07:53:41.76296287 +0000 UTC m=+140.565913588" lastFinishedPulling="2026-04-17 07:53:44.141433505 +0000 UTC m=+142.944384237" observedRunningTime="2026-04-17 07:53:44.301281022 +0000 UTC m=+143.104231758" watchObservedRunningTime="2026-04-17 07:53:44.302204176 +0000 UTC m=+143.105154928"
Apr 17 07:53:44.399409 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:44.398742 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b8db6fcbc-qdtzr"]
Apr 17 07:53:44.821739 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:44.821702 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cb4407b_fac2_416d_abdd_9452c01b5c16.slice/crio-259906c7ca68fa2da806b3bd614c6bb29dd74401236d8ac5aea17c0d0c6e17f2 WatchSource:0}: Error finding container 259906c7ca68fa2da806b3bd614c6bb29dd74401236d8ac5aea17c0d0c6e17f2: Status 404 returned error can't find the container with id 259906c7ca68fa2da806b3bd614c6bb29dd74401236d8ac5aea17c0d0c6e17f2
Apr 17 07:53:45.287877 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.287837 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" event={"ID":"6ed2baca-4a17-4906-9829-56274b0374d5","Type":"ContainerStarted","Data":"8bb85172046f780fb9af133784722c9b2f3c514d82d0d276cb36b79ba0406759"}
Apr 17 07:53:45.289412 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.289378 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8db6fcbc-qdtzr" event={"ID":"4cb4407b-fac2-416d-abdd-9452c01b5c16","Type":"ContainerStarted","Data":"259906c7ca68fa2da806b3bd614c6bb29dd74401236d8ac5aea17c0d0c6e17f2"}
Apr 17 07:53:45.305600 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.304932 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m4cqz" podStartSLOduration=33.194700152 podStartE2EDuration="35.304919341s" podCreationTimestamp="2026-04-17 07:53:10 +0000 UTC" firstStartedPulling="2026-04-17 07:53:42.758112112 +0000 UTC m=+141.561062831" lastFinishedPulling="2026-04-17 07:53:44.868331295 +0000 UTC m=+143.671282020" observedRunningTime="2026-04-17 07:53:45.303797292 +0000 UTC m=+144.106748046" watchObservedRunningTime="2026-04-17 07:53:45.304919341 +0000 UTC m=+144.107870082"
Apr 17 07:53:45.370810 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.370779 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h"]
Apr 17 07:53:45.373717 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.373701 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h"
Apr 17 07:53:45.376087 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.376036 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 07:53:45.376322 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.376302 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-fbttq\""
Apr 17 07:53:45.381111 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.381084 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h"]
Apr 17 07:53:45.545140 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.545044 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c586da6e-7ea7-4dbc-bc58-bec868051781-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w6z6h\" (UID: \"c586da6e-7ea7-4dbc-bc58-bec868051781\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h"
Apr 17 07:53:45.646207 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.646160 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c586da6e-7ea7-4dbc-bc58-bec868051781-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w6z6h\" (UID: \"c586da6e-7ea7-4dbc-bc58-bec868051781\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h"
Apr 17 07:53:45.646402 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:45.646375 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls"
not found Apr 17 07:53:45.646494 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:45.646482 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c586da6e-7ea7-4dbc-bc58-bec868051781-tls-certificates podName:c586da6e-7ea7-4dbc-bc58-bec868051781 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:46.146459517 +0000 UTC m=+144.949410250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/c586da6e-7ea7-4dbc-bc58-bec868051781-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-w6z6h" (UID: "c586da6e-7ea7-4dbc-bc58-bec868051781") : secret "prometheus-operator-admission-webhook-tls" not found Apr 17 07:53:45.902662 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.902569 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-765d9fcf8-7tpsf"] Apr 17 07:53:45.905849 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.905828 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:45.914488 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.914257 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 07:53:45.915395 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:45.915373 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-765d9fcf8-7tpsf"] Apr 17 07:53:46.049864 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.049829 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-oauth-serving-cert\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.049864 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.049870 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-oauth-config\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.050079 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.049895 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b64ck\" (UniqueName: \"kubernetes.io/projected/1607537c-a1af-44e7-8796-bfced16b02f3-kube-api-access-b64ck\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.050079 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.049974 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-service-ca\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.050079 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.050030 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-serving-cert\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.050079 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.050062 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-trusted-ca-bundle\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.050219 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.050108 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-console-config\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.150745 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.150713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-oauth-config\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.150745 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.150748 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b64ck\" (UniqueName: \"kubernetes.io/projected/1607537c-a1af-44e7-8796-bfced16b02f3-kube-api-access-b64ck\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.150978 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.150792 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-service-ca\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.150978 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.150856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-serving-cert\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.150978 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.150878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-trusted-ca-bundle\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.150978 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.150905 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-console-config\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.151180 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:53:46.151107 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c586da6e-7ea7-4dbc-bc58-bec868051781-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w6z6h\" (UID: \"c586da6e-7ea7-4dbc-bc58-bec868051781\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h" Apr 17 07:53:46.151180 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.151166 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-oauth-serving-cert\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.151866 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.151839 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-oauth-serving-cert\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.151866 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.151859 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-trusted-ca-bundle\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.151866 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.151862 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-service-ca\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " 
pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.152099 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.151826 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-console-config\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.153428 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.153374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-oauth-config\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.153542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.153510 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-serving-cert\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.153709 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.153685 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c586da6e-7ea7-4dbc-bc58-bec868051781-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w6z6h\" (UID: \"c586da6e-7ea7-4dbc-bc58-bec868051781\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h" Apr 17 07:53:46.158838 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.158813 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b64ck\" (UniqueName: 
\"kubernetes.io/projected/1607537c-a1af-44e7-8796-bfced16b02f3-kube-api-access-b64ck\") pod \"console-765d9fcf8-7tpsf\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.217722 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.217687 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:53:46.286022 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.283913 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h" Apr 17 07:53:46.373513 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.373474 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-765d9fcf8-7tpsf"] Apr 17 07:53:46.376269 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:46.376235 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1607537c_a1af_44e7_8796_bfced16b02f3.slice/crio-b736e9af32de4a9e36c6d235778ede254b0e05e39ee1f7f74e1805b66ed4e801 WatchSource:0}: Error finding container b736e9af32de4a9e36c6d235778ede254b0e05e39ee1f7f74e1805b66ed4e801: Status 404 returned error can't find the container with id b736e9af32de4a9e36c6d235778ede254b0e05e39ee1f7f74e1805b66ed4e801 Apr 17 07:53:46.444150 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:46.444055 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h"] Apr 17 07:53:46.449263 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:46.449229 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc586da6e_7ea7_4dbc_bc58_bec868051781.slice/crio-8105e5e85e1b8e6d3ae25ec3330c3576146e4cc2e0ef633a0bbda543999f8736 WatchSource:0}: Error finding container 
8105e5e85e1b8e6d3ae25ec3330c3576146e4cc2e0ef633a0bbda543999f8736: Status 404 returned error can't find the container with id 8105e5e85e1b8e6d3ae25ec3330c3576146e4cc2e0ef633a0bbda543999f8736 Apr 17 07:53:47.301750 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:47.301707 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h" event={"ID":"c586da6e-7ea7-4dbc-bc58-bec868051781","Type":"ContainerStarted","Data":"8105e5e85e1b8e6d3ae25ec3330c3576146e4cc2e0ef633a0bbda543999f8736"} Apr 17 07:53:47.303182 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:47.303149 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-765d9fcf8-7tpsf" event={"ID":"1607537c-a1af-44e7-8796-bfced16b02f3","Type":"ContainerStarted","Data":"b736e9af32de4a9e36c6d235778ede254b0e05e39ee1f7f74e1805b66ed4e801"} Apr 17 07:53:49.311020 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:49.310977 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h" event={"ID":"c586da6e-7ea7-4dbc-bc58-bec868051781","Type":"ContainerStarted","Data":"3f406b2a478617b7a90fe2077440c25a93c2c8bc2ae4befd7b4cd8f2d8e573f0"} Apr 17 07:53:49.311515 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:49.311356 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h" Apr 17 07:53:49.312663 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:49.312600 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-765d9fcf8-7tpsf" event={"ID":"1607537c-a1af-44e7-8796-bfced16b02f3","Type":"ContainerStarted","Data":"abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5"} Apr 17 07:53:49.314362 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:49.314328 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-7b8db6fcbc-qdtzr" event={"ID":"4cb4407b-fac2-416d-abdd-9452c01b5c16","Type":"ContainerStarted","Data":"16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646"} Apr 17 07:53:49.316462 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:49.316442 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h" Apr 17 07:53:49.326760 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:49.326702 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6z6h" podStartSLOduration=1.945448749 podStartE2EDuration="4.326689575s" podCreationTimestamp="2026-04-17 07:53:45 +0000 UTC" firstStartedPulling="2026-04-17 07:53:46.451524439 +0000 UTC m=+145.254475161" lastFinishedPulling="2026-04-17 07:53:48.832765249 +0000 UTC m=+147.635715987" observedRunningTime="2026-04-17 07:53:49.326042937 +0000 UTC m=+148.128993679" watchObservedRunningTime="2026-04-17 07:53:49.326689575 +0000 UTC m=+148.129640310" Apr 17 07:53:49.341967 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:49.341430 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b8db6fcbc-qdtzr" podStartSLOduration=2.331691067 podStartE2EDuration="6.341413686s" podCreationTimestamp="2026-04-17 07:53:43 +0000 UTC" firstStartedPulling="2026-04-17 07:53:44.823714544 +0000 UTC m=+143.626665263" lastFinishedPulling="2026-04-17 07:53:48.833437149 +0000 UTC m=+147.636387882" observedRunningTime="2026-04-17 07:53:49.341126561 +0000 UTC m=+148.144077314" watchObservedRunningTime="2026-04-17 07:53:49.341413686 +0000 UTC m=+148.144364429" Apr 17 07:53:49.370506 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:49.370455 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-765d9fcf8-7tpsf" podStartSLOduration=1.910707576 
podStartE2EDuration="4.370438119s" podCreationTimestamp="2026-04-17 07:53:45 +0000 UTC" firstStartedPulling="2026-04-17 07:53:46.380495194 +0000 UTC m=+145.183445917" lastFinishedPulling="2026-04-17 07:53:48.840225736 +0000 UTC m=+147.643176460" observedRunningTime="2026-04-17 07:53:49.369891577 +0000 UTC m=+148.172842319" watchObservedRunningTime="2026-04-17 07:53:49.370438119 +0000 UTC m=+148.173388860" Apr 17 07:53:50.431812 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.431777 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hkkmn"] Apr 17 07:53:50.435569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.435543 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.438164 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.438131 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 07:53:50.438370 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.438349 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 07:53:50.438546 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.438529 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 07:53:50.439348 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.439325 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-9q7m9\"" Apr 17 07:53:50.442323 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.442277 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hkkmn"] Apr 17 07:53:50.597097 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.597065 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxh54\" (UniqueName: \"kubernetes.io/projected/01f11896-aba0-4324-a913-4bc5ab88a7d4-kube-api-access-kxh54\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.597264 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.597116 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01f11896-aba0-4324-a913-4bc5ab88a7d4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.597264 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.597206 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01f11896-aba0-4324-a913-4bc5ab88a7d4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.597264 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.597259 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/01f11896-aba0-4324-a913-4bc5ab88a7d4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.698094 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.697999 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/01f11896-aba0-4324-a913-4bc5ab88a7d4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.698094 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.698072 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/01f11896-aba0-4324-a913-4bc5ab88a7d4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.698345 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.698122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxh54\" (UniqueName: \"kubernetes.io/projected/01f11896-aba0-4324-a913-4bc5ab88a7d4-kube-api-access-kxh54\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.698345 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.698175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01f11896-aba0-4324-a913-4bc5ab88a7d4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.698863 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.698815 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01f11896-aba0-4324-a913-4bc5ab88a7d4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: 
\"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.700943 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.700913 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01f11896-aba0-4324-a913-4bc5ab88a7d4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.701085 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.701043 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/01f11896-aba0-4324-a913-4bc5ab88a7d4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.706392 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.706357 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxh54\" (UniqueName: \"kubernetes.io/projected/01f11896-aba0-4324-a913-4bc5ab88a7d4-kube-api-access-kxh54\") pod \"prometheus-operator-5676c8c784-hkkmn\" (UID: \"01f11896-aba0-4324-a913-4bc5ab88a7d4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.749114 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.749074 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" Apr 17 07:53:50.890713 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:50.890684 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hkkmn"] Apr 17 07:53:50.893950 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:53:50.893921 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f11896_aba0_4324_a913_4bc5ab88a7d4.slice/crio-d3db39183078f8eb8a6498509e2716264b34ee75f331bff44e8e4946b04826d7 WatchSource:0}: Error finding container d3db39183078f8eb8a6498509e2716264b34ee75f331bff44e8e4946b04826d7: Status 404 returned error can't find the container with id d3db39183078f8eb8a6498509e2716264b34ee75f331bff44e8e4946b04826d7 Apr 17 07:53:51.321505 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:51.321467 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" event={"ID":"01f11896-aba0-4324-a913-4bc5ab88a7d4","Type":"ContainerStarted","Data":"d3db39183078f8eb8a6498509e2716264b34ee75f331bff44e8e4946b04826d7"} Apr 17 07:53:53.330344 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:53.330274 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" event={"ID":"01f11896-aba0-4324-a913-4bc5ab88a7d4","Type":"ContainerStarted","Data":"e174134c73227ff72283e72f54076ba63373c77de18dacc71700dc054d36120c"} Apr 17 07:53:53.330344 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:53.330337 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" event={"ID":"01f11896-aba0-4324-a913-4bc5ab88a7d4","Type":"ContainerStarted","Data":"9d34f65c3568d0161924770433680a7526ab9e38a85d4da0fcfac84a0c23f828"} Apr 17 07:53:53.346871 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:53.346822 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-hkkmn" podStartSLOduration=1.944178312 podStartE2EDuration="3.346807553s" podCreationTimestamp="2026-04-17 07:53:50 +0000 UTC" firstStartedPulling="2026-04-17 07:53:50.896257573 +0000 UTC m=+149.699208292" lastFinishedPulling="2026-04-17 07:53:52.298886797 +0000 UTC m=+151.101837533" observedRunningTime="2026-04-17 07:53:53.345414367 +0000 UTC m=+152.148365133" watchObservedRunningTime="2026-04-17 07:53:53.346807553 +0000 UTC m=+152.149758297" Apr 17 07:53:54.247123 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.247082 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b8db6fcbc-qdtzr" Apr 17 07:53:54.247346 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.247136 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b8db6fcbc-qdtzr" Apr 17 07:53:54.248674 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.248647 2567 patch_prober.go:28] interesting pod/console-7b8db6fcbc-qdtzr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused" start-of-body= Apr 17 07:53:54.248795 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.248693 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7b8db6fcbc-qdtzr" podUID="4cb4407b-fac2-416d-abdd-9452c01b5c16" containerName="console" probeResult="failure" output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused" Apr 17 07:53:54.767250 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.767212 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fdls8"] Apr 17 07:53:54.773820 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.773794 2567 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.776511 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.776214 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 07:53:54.776511 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.776223 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 07:53:54.776680 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.776556 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s4jk5\"" Apr 17 07:53:54.776897 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.776803 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 07:53:54.785572 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.785473 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-hvztr"] Apr 17 07:53:54.789939 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.789898 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.792190 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.792168 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 07:53:54.792722 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.792569 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 07:53:54.793089 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.792894 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 07:53:54.793089 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.792934 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-cgmcl\"" Apr 17 07:53:54.807355 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.807138 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-hvztr"] Apr 17 07:53:54.833362 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833332 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.833484 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833418 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-textfile\") pod \"node-exporter-fdls8\" (UID: 
\"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.833484 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833449 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-accelerators-collector-config\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.833606 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833484 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-wtmp\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.833606 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833536 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjqcz\" (UniqueName: \"kubernetes.io/projected/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-kube-api-access-qjqcz\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.833702 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833606 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.833702 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833646 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.833702 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833681 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-root\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.833846 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833705 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-metrics-client-ca\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.833846 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-sys\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.833846 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833763 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-tls\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 
17 07:53:54.833846 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833790 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.833846 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.834171 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833876 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.834171 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.833901 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spzvf\" (UniqueName: \"kubernetes.io/projected/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-api-access-spzvf\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.935124 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:53:54.935066 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.935332 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935129 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.935332 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935160 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spzvf\" (UniqueName: \"kubernetes.io/projected/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-api-access-spzvf\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.935332 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935188 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.935332 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-textfile\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.935332 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935267 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-accelerators-collector-config\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.935332 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935315 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-wtmp\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.935659 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqcz\" (UniqueName: \"kubernetes.io/projected/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-kube-api-access-qjqcz\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.935659 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935384 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.935659 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:53:54.935411 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.935659 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935443 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-root\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.935659 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935468 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-metrics-client-ca\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.935659 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-sys\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.935659 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-tls\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.935659 
ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935546 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.936078 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.935729 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.936132 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.936061 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.936182 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.936152 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-root\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.937486 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.936315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-sys\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.937486 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.936802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-metrics-client-ca\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.937486 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.936856 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-textfile\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.937486 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.936990 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-wtmp\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.937486 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:54.937072 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 07:53:54.937486 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:54.937131 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-tls podName:992a09a6-6ee8-42d1-b1cc-ddac80952b0d nodeName:}" failed. No retries permitted until 2026-04-17 07:53:55.437112878 +0000 UTC m=+154.240063603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-tls") pod "node-exporter-fdls8" (UID: "992a09a6-6ee8-42d1-b1cc-ddac80952b0d") : secret "node-exporter-tls" not found Apr 17 07:53:54.937486 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.937262 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-accelerators-collector-config\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.937486 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:54.937381 2567 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 07:53:54.937486 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:54.937432 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-tls podName:2bb4af48-7fc1-4da0-96dd-46c44953d2d1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:55.437415982 +0000 UTC m=+154.240366718 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-hvztr" (UID: "2bb4af48-7fc1-4da0-96dd-46c44953d2d1") : secret "kube-state-metrics-tls" not found Apr 17 07:53:54.938971 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.938931 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.941312 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.940750 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:54.943648 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.943613 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.948306 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.946601 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spzvf\" (UniqueName: \"kubernetes.io/projected/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-api-access-spzvf\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: 
\"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:54.948306 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:54.947075 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjqcz\" (UniqueName: \"kubernetes.io/projected/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-kube-api-access-qjqcz\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:55.451002 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.450943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:55.451182 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.451020 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-tls\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:55.454221 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.454186 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/992a09a6-6ee8-42d1-b1cc-ddac80952b0d-node-exporter-tls\") pod \"node-exporter-fdls8\" (UID: \"992a09a6-6ee8-42d1-b1cc-ddac80952b0d\") " pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:55.454698 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.454673 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/2bb4af48-7fc1-4da0-96dd-46c44953d2d1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hvztr\" (UID: \"2bb4af48-7fc1-4da0-96dd-46c44953d2d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:55.687027 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.686991 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fdls8" Apr 17 07:53:55.707926 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.707843 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" Apr 17 07:53:55.884711 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.884678 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:53:55.891779 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.891745 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:53:55.896395 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.895997 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 07:53:55.896395 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.896048 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 07:53:55.896395 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.896113 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-72zhp\"" Apr 17 07:53:55.896395 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.896159 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 07:53:55.896395 ip-10-0-130-28 kubenswrapper[2567]: I0417 
07:53:55.896234 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 07:53:55.896395 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.896255 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 07:53:55.896395 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.896329 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 07:53:55.896843 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.896432 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 07:53:55.896843 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.896465 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 07:53:55.896843 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.896506 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 07:53:55.905156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.905131 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:53:55.954392 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954357 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:53:55.954577 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954405 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954577 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954434 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-web-config\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954577 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954469 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954577 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954509 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-config-out\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954779 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954531 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954779 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954728 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954854 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954781 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954854 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954814 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-config-volume\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954854 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954842 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q2fk\" (UniqueName: \"kubernetes.io/projected/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-kube-api-access-6q2fk\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954991 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954889 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954991 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954942 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:55.954991 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:55.954975 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.055451 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.055403 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.055650 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.055455 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.055650 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.055490 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.055650 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.055517 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.055952 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.055924 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-web-config\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.056045 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.055969 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.056045 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.055995 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-config-out\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.056045 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.056018 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.056193 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.056102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.056193 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.056151 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.056193 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.056181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-config-volume\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.056378 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.056205 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q2fk\" (UniqueName: \"kubernetes.io/projected/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-kube-api-access-6q2fk\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.056378 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.056265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.056492 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.056401 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.057153 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.057127 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.058611 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.058586 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.059340 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.059226 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.059856 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.059822 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.060326 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.060280 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.061357 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.061339 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-config-out\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.061567 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.061546 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.061963 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.061939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-config-volume\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.063153 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.063134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.063323 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.063138 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-web-config\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.063323 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.063173 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.066594 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.066572 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q2fk\" (UniqueName: \"kubernetes.io/projected/6d4a32f9-181c-4398-82b0-3b1cf0ab3e87-kube-api-access-6q2fk\") pod \"alertmanager-main-0\" (UID: \"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.206783 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.206753 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:53:56.218661 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.218633 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-765d9fcf8-7tpsf"
Apr 17 07:53:56.218782 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.218679 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-765d9fcf8-7tpsf"
Apr 17 07:53:56.220249 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.220213 2567 patch_prober.go:28] interesting pod/console-765d9fcf8-7tpsf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused" start-of-body=
Apr 17 07:53:56.220374 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:56.220263 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-765d9fcf8-7tpsf" podUID="1607537c-a1af-44e7-8796-bfced16b02f3" containerName="console" probeResult="failure" output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused"
Apr 17 07:53:57.535159 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:57.535113 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mspd5" podUID="5e3cf222-71f9-4a25-88bb-37c528ac2994"
Apr 17 07:53:57.542237 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:53:57.542197 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-sm78q" podUID="fcbda289-b762-45ea-ba60-5188e612db63"
Apr 17 07:53:57.757583 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.757548 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-547b55c77c-mkwgp"]
Apr 17 07:53:57.771076 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.770164 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.771076 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.771041 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-547b55c77c-mkwgp"]
Apr 17 07:53:57.774379 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.774349 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 07:53:57.774636 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.774603 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-lwpxh\""
Apr 17 07:53:57.774852 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.774831 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 07:53:57.774852 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.774845 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 07:53:57.775026 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.775008 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 07:53:57.775086 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.775012 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 07:53:57.775125 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.775095 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-a0qpl0elqmtig\""
Apr 17 07:53:57.873310 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.873204 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.873310 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.873255 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed049ead-6ee3-4a75-945e-6168dd530b2c-metrics-client-ca\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.873544 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.873358 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj94d\" (UniqueName: \"kubernetes.io/projected/ed049ead-6ee3-4a75-945e-6168dd530b2c-kube-api-access-sj94d\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.873544 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.873417 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.873544 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.873443 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-grpc-tls\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.873544 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.873463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.873743 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.873591 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-tls\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.874607 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.873634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.975262 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.975226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.975262 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.975267 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed049ead-6ee3-4a75-945e-6168dd530b2c-metrics-client-ca\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.975521 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.975310 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sj94d\" (UniqueName: \"kubernetes.io/projected/ed049ead-6ee3-4a75-945e-6168dd530b2c-kube-api-access-sj94d\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.975521 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.975365 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.975521 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.975400 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-grpc-tls\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.975521 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.975430 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.975758 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.975526 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-tls\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.975758 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.975556 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.976089 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.976057 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed049ead-6ee3-4a75-945e-6168dd530b2c-metrics-client-ca\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.978573 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.978522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.978949 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.978720 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.978949 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.978798 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-grpc-tls\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.979078 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.978987 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.979078 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.979001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-tls\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.979078 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.979048 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ed049ead-6ee3-4a75-945e-6168dd530b2c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:57.984893 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:57.984872 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj94d\" (UniqueName: \"kubernetes.io/projected/ed049ead-6ee3-4a75-945e-6168dd530b2c-kube-api-access-sj94d\") pod \"thanos-querier-547b55c77c-mkwgp\" (UID: \"ed049ead-6ee3-4a75-945e-6168dd530b2c\") " pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:58.084989 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:58.084950 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:53:58.358879 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:58.358849 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sm78q"
Apr 17 07:53:58.358879 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:58.358864 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mspd5"
Apr 17 07:53:59.169871 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.169782 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7c648db48d-fm8f7"]
Apr 17 07:53:59.174867 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.174844 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.177307 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.177266 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 07:53:59.177436 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.177267 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 07:53:59.177436 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.177266 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-8wfx7\""
Apr 17 07:53:59.177436 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.177348 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 07:53:59.178701 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.178676 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 07:53:59.178701 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.178686 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9kkc7ke8ju6vb\""
Apr 17 07:53:59.180918 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.180891 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c648db48d-fm8f7"]
Apr 17 07:53:59.186479 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.186459 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmfzk\" (UniqueName: \"kubernetes.io/projected/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-kube-api-access-dmfzk\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.186579 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.186511 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-metrics-server-audit-profiles\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.186657 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.186620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-client-ca-bundle\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.186752 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.186693 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.186801 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.186785 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-audit-log\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.186838 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.186817 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-secret-metrics-server-tls\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.186882 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.186839 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-secret-metrics-server-client-certs\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.288086 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.288047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-audit-log\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.288279 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.288108 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-secret-metrics-server-tls\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.288279 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.288137 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-secret-metrics-server-client-certs\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.288279 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.288171 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmfzk\" (UniqueName: \"kubernetes.io/projected/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-kube-api-access-dmfzk\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.288279 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.288202 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-metrics-server-audit-profiles\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.288521 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.288307 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-client-ca-bundle\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.288521 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.288360 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.288621 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.288534 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-audit-log\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.289249 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.289196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.289760 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.289736 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-metrics-server-audit-profiles\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:53:59.291129 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.291102 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-secret-metrics-server-client-certs\") pod
\"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" Apr 17 07:53:59.291340 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.291316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-client-ca-bundle\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" Apr 17 07:53:59.291438 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.291382 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-secret-metrics-server-tls\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" Apr 17 07:53:59.297276 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.297252 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmfzk\" (UniqueName: \"kubernetes.io/projected/29d6bf46-15ba-4280-8c5e-c80fe3427b1d-kube-api-access-dmfzk\") pod \"metrics-server-7c648db48d-fm8f7\" (UID: \"29d6bf46-15ba-4280-8c5e-c80fe3427b1d\") " pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" Apr 17 07:53:59.489472 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:53:59.489386 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" Apr 17 07:54:00.285786 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:54:00.285750 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod992a09a6_6ee8_42d1_b1cc_ddac80952b0d.slice/crio-1ec35efc2127da37f73c8cef2c95b011b5a1541266338ee216eeae93ed3c7a10 WatchSource:0}: Error finding container 1ec35efc2127da37f73c8cef2c95b011b5a1541266338ee216eeae93ed3c7a10: Status 404 returned error can't find the container with id 1ec35efc2127da37f73c8cef2c95b011b5a1541266338ee216eeae93ed3c7a10 Apr 17 07:54:00.380979 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.380935 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fdls8" event={"ID":"992a09a6-6ee8-42d1-b1cc-ddac80952b0d","Type":"ContainerStarted","Data":"1ec35efc2127da37f73c8cef2c95b011b5a1541266338ee216eeae93ed3c7a10"} Apr 17 07:54:00.448374 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.448238 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-hvztr"] Apr 17 07:54:00.478523 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.478483 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:54:00.480345 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:54:00.480313 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d4a32f9_181c_4398_82b0_3b1cf0ab3e87.slice/crio-40d8136580828a7e8fa71d071c9bf4603be5c992f91ce6932853ec4bd69b88b9 WatchSource:0}: Error finding container 40d8136580828a7e8fa71d071c9bf4603be5c992f91ce6932853ec4bd69b88b9: Status 404 returned error can't find the container with id 40d8136580828a7e8fa71d071c9bf4603be5c992f91ce6932853ec4bd69b88b9 Apr 17 07:54:00.508852 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.508824 
2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b8db6fcbc-qdtzr"] Apr 17 07:54:00.539245 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.539157 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-799c8767ff-pld6g"] Apr 17 07:54:00.542974 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.542952 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.550694 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.550673 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799c8767ff-pld6g"] Apr 17 07:54:00.602917 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.602865 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-service-ca\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.602917 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.602915 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hzhn\" (UniqueName: \"kubernetes.io/projected/3f756eab-34b4-41f3-abc6-e71fe10ea19d-kube-api-access-6hzhn\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.603134 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.602952 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-oauth-serving-cert\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.603134 
ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.603048 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-serving-cert\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.603224 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.603139 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-config\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.603224 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.603165 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-trusted-ca-bundle\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.603224 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.603208 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-oauth-config\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.699419 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.699383 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c648db48d-fm8f7"] Apr 17 07:54:00.703695 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.703664 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-service-ca\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.703811 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.703712 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hzhn\" (UniqueName: \"kubernetes.io/projected/3f756eab-34b4-41f3-abc6-e71fe10ea19d-kube-api-access-6hzhn\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.703811 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.703761 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-oauth-serving-cert\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.703925 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.703812 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-serving-cert\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.703925 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.703907 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-config\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.704028 
ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.703934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-trusted-ca-bundle\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.704028 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.703984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-oauth-config\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.704447 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.704421 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-service-ca\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.704724 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:54:00.704697 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29d6bf46_15ba_4280_8c5e_c80fe3427b1d.slice/crio-410f1cfbb57417976b0838620e14e6e4a883d73e934897f80658322b21c9e4c8 WatchSource:0}: Error finding container 410f1cfbb57417976b0838620e14e6e4a883d73e934897f80658322b21c9e4c8: Status 404 returned error can't find the container with id 410f1cfbb57417976b0838620e14e6e4a883d73e934897f80658322b21c9e4c8 Apr 17 07:54:00.705880 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.704995 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-config\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.705880 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.705048 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-oauth-serving-cert\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.712850 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.712797 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-oauth-config\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.713816 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.713790 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-trusted-ca-bundle\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.716064 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.715636 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-serving-cert\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.716064 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.715785 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/thanos-querier-547b55c77c-mkwgp"] Apr 17 07:54:00.720755 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:54:00.720331 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded049ead_6ee3_4a75_945e_6168dd530b2c.slice/crio-ad144e3d8c0683cbf0285f2d6a5365d23b08647537b2987f2a4d93c0ef37c69c WatchSource:0}: Error finding container ad144e3d8c0683cbf0285f2d6a5365d23b08647537b2987f2a4d93c0ef37c69c: Status 404 returned error can't find the container with id ad144e3d8c0683cbf0285f2d6a5365d23b08647537b2987f2a4d93c0ef37c69c Apr 17 07:54:00.721862 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.721727 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hzhn\" (UniqueName: \"kubernetes.io/projected/3f756eab-34b4-41f3-abc6-e71fe10ea19d-kube-api-access-6hzhn\") pod \"console-799c8767ff-pld6g\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:00.853079 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:00.852851 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:01.031732 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.030377 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799c8767ff-pld6g"] Apr 17 07:54:01.206263 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:54:01.206171 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f756eab_34b4_41f3_abc6_e71fe10ea19d.slice/crio-4c5506c4d6b4f8d4df939eb376344bfd567ac27e961f99a1f397fca3349b9317 WatchSource:0}: Error finding container 4c5506c4d6b4f8d4df939eb376344bfd567ac27e961f99a1f397fca3349b9317: Status 404 returned error can't find the container with id 4c5506c4d6b4f8d4df939eb376344bfd567ac27e961f99a1f397fca3349b9317 Apr 17 07:54:01.388792 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.388170 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tw9dk" event={"ID":"7d7c2a20-730c-49bd-ac08-496b110637bd","Type":"ContainerStarted","Data":"4d14ff5fda6140c6c169586addd103f1bedb413a539dc710c2aab07eb973813b"} Apr 17 07:54:01.389439 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.389156 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-tw9dk" Apr 17 07:54:01.391815 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.391783 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" event={"ID":"2bb4af48-7fc1-4da0-96dd-46c44953d2d1","Type":"ContainerStarted","Data":"0f0f8a75b07bae0e9cd0dbfcf061bbf227b307f8a4437fbc60be622d171c81ab"} Apr 17 07:54:01.395272 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.395224 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp" 
event={"ID":"ed049ead-6ee3-4a75-945e-6168dd530b2c","Type":"ContainerStarted","Data":"ad144e3d8c0683cbf0285f2d6a5365d23b08647537b2987f2a4d93c0ef37c69c"} Apr 17 07:54:01.396843 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.396816 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-tw9dk" Apr 17 07:54:01.398277 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.397757 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fdls8" event={"ID":"992a09a6-6ee8-42d1-b1cc-ddac80952b0d","Type":"ContainerStarted","Data":"a202948a7ad1614a9cba1fa61bc913045cbf8116267a32af56242cbbddb14866"} Apr 17 07:54:01.400363 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.400114 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799c8767ff-pld6g" event={"ID":"3f756eab-34b4-41f3-abc6-e71fe10ea19d","Type":"ContainerStarted","Data":"a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1"} Apr 17 07:54:01.400363 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.400147 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799c8767ff-pld6g" event={"ID":"3f756eab-34b4-41f3-abc6-e71fe10ea19d","Type":"ContainerStarted","Data":"4c5506c4d6b4f8d4df939eb376344bfd567ac27e961f99a1f397fca3349b9317"} Apr 17 07:54:01.401445 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.401418 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" event={"ID":"29d6bf46-15ba-4280-8c5e-c80fe3427b1d","Type":"ContainerStarted","Data":"410f1cfbb57417976b0838620e14e6e4a883d73e934897f80658322b21c9e4c8"} Apr 17 07:54:01.403972 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.403232 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87","Type":"ContainerStarted","Data":"40d8136580828a7e8fa71d071c9bf4603be5c992f91ce6932853ec4bd69b88b9"} Apr 17 07:54:01.409195 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.408451 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-tw9dk" podStartSLOduration=1.673922316 podStartE2EDuration="20.408435175s" podCreationTimestamp="2026-04-17 07:53:41 +0000 UTC" firstStartedPulling="2026-04-17 07:53:41.670791774 +0000 UTC m=+140.473742509" lastFinishedPulling="2026-04-17 07:54:00.405304647 +0000 UTC m=+159.208255368" observedRunningTime="2026-04-17 07:54:01.407982206 +0000 UTC m=+160.210932942" watchObservedRunningTime="2026-04-17 07:54:01.408435175 +0000 UTC m=+160.211385917" Apr 17 07:54:01.468141 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:01.468038 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-799c8767ff-pld6g" podStartSLOduration=1.468018054 podStartE2EDuration="1.468018054s" podCreationTimestamp="2026-04-17 07:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:01.467182806 +0000 UTC m=+160.270133548" watchObservedRunningTime="2026-04-17 07:54:01.468018054 +0000 UTC m=+160.270968797" Apr 17 07:54:02.424891 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:02.424850 2567 generic.go:358] "Generic (PLEG): container finished" podID="992a09a6-6ee8-42d1-b1cc-ddac80952b0d" containerID="a202948a7ad1614a9cba1fa61bc913045cbf8116267a32af56242cbbddb14866" exitCode=0 Apr 17 07:54:02.426460 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:02.425102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " 
pod="openshift-ingress-canary/ingress-canary-sm78q" Apr 17 07:54:02.426584 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:02.426546 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:54:02.427360 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:02.426331 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fdls8" event={"ID":"992a09a6-6ee8-42d1-b1cc-ddac80952b0d","Type":"ContainerDied","Data":"a202948a7ad1614a9cba1fa61bc913045cbf8116267a32af56242cbbddb14866"} Apr 17 07:54:02.428621 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:02.428595 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcbda289-b762-45ea-ba60-5188e612db63-cert\") pod \"ingress-canary-sm78q\" (UID: \"fcbda289-b762-45ea-ba60-5188e612db63\") " pod="openshift-ingress-canary/ingress-canary-sm78q" Apr 17 07:54:02.430959 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:02.430901 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e3cf222-71f9-4a25-88bb-37c528ac2994-metrics-tls\") pod \"dns-default-mspd5\" (UID: \"5e3cf222-71f9-4a25-88bb-37c528ac2994\") " pod="openshift-dns/dns-default-mspd5" Apr 17 07:54:02.563257 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:02.563207 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mw88v\"" Apr 17 07:54:02.563619 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:02.563486 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9vrx5\"" Apr 17 07:54:02.570724 ip-10-0-130-28 kubenswrapper[2567]: I0417 
07:54:02.570690 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sm78q" Apr 17 07:54:02.571436 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:02.571401 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mspd5" Apr 17 07:54:04.304308 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.304254 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-765d9fcf8-7tpsf"] Apr 17 07:54:04.336034 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.334953 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-557d45fc88-bpgvk"] Apr 17 07:54:04.375736 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.375699 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557d45fc88-bpgvk"] Apr 17 07:54:04.375921 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.375854 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-557d45fc88-bpgvk" Apr 17 07:54:04.447822 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.447791 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-oauth-serving-cert\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk" Apr 17 07:54:04.447822 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.447828 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-trusted-ca-bundle\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk" Apr 17 07:54:04.448051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.447926 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-serving-cert\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk" Apr 17 07:54:04.448051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.447972 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-service-ca\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk" Apr 17 07:54:04.448051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.448030 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cm2r\" 
(UniqueName: \"kubernetes.io/projected/1ff6c792-cd18-42a1-ba8b-7f472632c7df-kube-api-access-2cm2r\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.448195 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.448059 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-oauth-config\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.448195 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.448086 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-config\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.549019 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.548982 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-oauth-serving-cert\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.549153 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.549032 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-trusted-ca-bundle\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.549153 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.549113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-serving-cert\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.549153 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.549146 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-service-ca\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.549360 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.549186 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cm2r\" (UniqueName: \"kubernetes.io/projected/1ff6c792-cd18-42a1-ba8b-7f472632c7df-kube-api-access-2cm2r\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.549360 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.549239 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-oauth-config\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.549360 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.549270 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-config\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.549788 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.549765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-oauth-serving-cert\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.550006 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.549921 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-trusted-ca-bundle\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.550679 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.550654 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-config\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.550767 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.550675 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-service-ca\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.552143 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.552119 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-serving-cert\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.552240 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.552223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-oauth-config\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.557840 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.557790 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cm2r\" (UniqueName: \"kubernetes.io/projected/1ff6c792-cd18-42a1-ba8b-7f472632c7df-kube-api-access-2cm2r\") pod \"console-557d45fc88-bpgvk\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.688652 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.688622 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:04.756611 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.756581 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sm78q"]
Apr 17 07:54:04.758375 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:54:04.758344 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcbda289_b762_45ea_ba60_5188e612db63.slice/crio-f2e3e5bbc8002316e5d687fd5a1c199d8b947ba038e84395788b467875326c17 WatchSource:0}: Error finding container f2e3e5bbc8002316e5d687fd5a1c199d8b947ba038e84395788b467875326c17: Status 404 returned error can't find the container with id f2e3e5bbc8002316e5d687fd5a1c199d8b947ba038e84395788b467875326c17
Apr 17 07:54:04.774748 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.774701 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mspd5"]
Apr 17 07:54:04.897714 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:04.897504 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557d45fc88-bpgvk"]
Apr 17 07:54:04.962828 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:54:04.962798 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ff6c792_cd18_42a1_ba8b_7f472632c7df.slice/crio-96fc0896882f625c512d4ed117072fe5b974341933eab56bc63feec3d5038f37 WatchSource:0}: Error finding container 96fc0896882f625c512d4ed117072fe5b974341933eab56bc63feec3d5038f37: Status 404 returned error can't find the container with id 96fc0896882f625c512d4ed117072fe5b974341933eab56bc63feec3d5038f37
Apr 17 07:54:05.443937 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.443903 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fdls8" event={"ID":"992a09a6-6ee8-42d1-b1cc-ddac80952b0d","Type":"ContainerStarted","Data":"c4650d44ad581146cae57cc7463786dc607dd2c652c7930a7bf6e5f15df49ea3"}
Apr 17 07:54:05.444367 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.443948 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fdls8" event={"ID":"992a09a6-6ee8-42d1-b1cc-ddac80952b0d","Type":"ContainerStarted","Data":"bbd4a5b9addda2c25f03bfe950284e0ac8234a34c7176ed75177f9799eedeee4"}
Apr 17 07:54:05.446626 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.446563 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557d45fc88-bpgvk" event={"ID":"1ff6c792-cd18-42a1-ba8b-7f472632c7df","Type":"ContainerStarted","Data":"52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6"}
Apr 17 07:54:05.446626 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.446598 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557d45fc88-bpgvk" event={"ID":"1ff6c792-cd18-42a1-ba8b-7f472632c7df","Type":"ContainerStarted","Data":"96fc0896882f625c512d4ed117072fe5b974341933eab56bc63feec3d5038f37"}
Apr 17 07:54:05.450862 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.450831 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" event={"ID":"29d6bf46-15ba-4280-8c5e-c80fe3427b1d","Type":"ContainerStarted","Data":"0b5b1539baa38755ed65578e8d56c04c09b6d3d7f42a9bfe1d263600cce4b846"}
Apr 17 07:54:05.454051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.453954 2567 generic.go:358] "Generic (PLEG): container finished" podID="6d4a32f9-181c-4398-82b0-3b1cf0ab3e87" containerID="eb7e4332f79c380d66a48432081c266a0065038a39e965054ff40c7f1e8ceedf" exitCode=0
Apr 17 07:54:05.454172 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.454048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87","Type":"ContainerDied","Data":"eb7e4332f79c380d66a48432081c266a0065038a39e965054ff40c7f1e8ceedf"}
Apr 17 07:54:05.459961 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.457790 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" event={"ID":"2bb4af48-7fc1-4da0-96dd-46c44953d2d1","Type":"ContainerStarted","Data":"63a2987e50596dfc4e0d2270d58ffd714d277a5530d99627e27c532f71b3a6eb"}
Apr 17 07:54:05.459961 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.457835 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" event={"ID":"2bb4af48-7fc1-4da0-96dd-46c44953d2d1","Type":"ContainerStarted","Data":"e2dfe0f923d1a697330efbb4dfceedf507028197a93270a0c1c6792c9957201c"}
Apr 17 07:54:05.459961 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.457851 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" event={"ID":"2bb4af48-7fc1-4da0-96dd-46c44953d2d1","Type":"ContainerStarted","Data":"7364e8693b554a6060b589c0acab5d8e8817ecde2bf0d41895b0aa486b23c7bc"}
Apr 17 07:54:05.459961 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.459625 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mspd5" event={"ID":"5e3cf222-71f9-4a25-88bb-37c528ac2994","Type":"ContainerStarted","Data":"b13eeeb60afba46f22810a9b61c3aeba0e3898f646f72fbca52825f8ee44be73"}
Apr 17 07:54:05.463875 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.463817 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp" event={"ID":"ed049ead-6ee3-4a75-945e-6168dd530b2c","Type":"ContainerStarted","Data":"5fc7c27b5f9abdbd2e4722c26a57382250405772625e6d9f65b6bd2df0e26714"}
Apr 17 07:54:05.463875 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.463845 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp" event={"ID":"ed049ead-6ee3-4a75-945e-6168dd530b2c","Type":"ContainerStarted","Data":"719f587f5a5ac4e10a50f550c2f30861f3ac1e0b1947c8707d983a4c76b1b891"}
Apr 17 07:54:05.465892 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.465856 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sm78q" event={"ID":"fcbda289-b762-45ea-ba60-5188e612db63","Type":"ContainerStarted","Data":"f2e3e5bbc8002316e5d687fd5a1c199d8b947ba038e84395788b467875326c17"}
Apr 17 07:54:05.477022 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.476394 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fdls8" podStartSLOduration=10.50163097 podStartE2EDuration="11.476375447s" podCreationTimestamp="2026-04-17 07:53:54 +0000 UTC" firstStartedPulling="2026-04-17 07:54:00.28843948 +0000 UTC m=+159.091390206" lastFinishedPulling="2026-04-17 07:54:01.263183949 +0000 UTC m=+160.066134683" observedRunningTime="2026-04-17 07:54:05.474340294 +0000 UTC m=+164.277291036" watchObservedRunningTime="2026-04-17 07:54:05.476375447 +0000 UTC m=+164.279326169"
Apr 17 07:54:05.550154 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.549511 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" podStartSLOduration=2.680452776 podStartE2EDuration="6.54948852s" podCreationTimestamp="2026-04-17 07:53:59 +0000 UTC" firstStartedPulling="2026-04-17 07:54:00.715300976 +0000 UTC m=+159.518251715" lastFinishedPulling="2026-04-17 07:54:04.584336737 +0000 UTC m=+163.387287459" observedRunningTime="2026-04-17 07:54:05.504162087 +0000 UTC m=+164.307112830" watchObservedRunningTime="2026-04-17 07:54:05.54948852 +0000 UTC m=+164.352439263"
Apr 17 07:54:05.597222 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.596419 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-hvztr" podStartSLOduration=7.480801054 podStartE2EDuration="11.596399524s" podCreationTimestamp="2026-04-17 07:53:54 +0000 UTC" firstStartedPulling="2026-04-17 07:54:00.464875927 +0000 UTC m=+159.267826654" lastFinishedPulling="2026-04-17 07:54:04.580474401 +0000 UTC m=+163.383425124" observedRunningTime="2026-04-17 07:54:05.55064166 +0000 UTC m=+164.353592402" watchObservedRunningTime="2026-04-17 07:54:05.596399524 +0000 UTC m=+164.399350268"
Apr 17 07:54:05.599095 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:05.598664 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-557d45fc88-bpgvk" podStartSLOduration=1.5986493689999999 podStartE2EDuration="1.598649369s" podCreationTimestamp="2026-04-17 07:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:05.595985528 +0000 UTC m=+164.398936271" watchObservedRunningTime="2026-04-17 07:54:05.598649369 +0000 UTC m=+164.401600111"
Apr 17 07:54:06.477328 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:06.477213 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp" event={"ID":"ed049ead-6ee3-4a75-945e-6168dd530b2c","Type":"ContainerStarted","Data":"437ff1d7ca7f5ce70bdd73b838cf2bcb1a9a7eacc0955e8aeef2b2183a3a6982"}
Apr 17 07:54:09.499857 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:09.499769 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mspd5" event={"ID":"5e3cf222-71f9-4a25-88bb-37c528ac2994","Type":"ContainerStarted","Data":"b0521d4f4056fded6a1394c72020ae34c323b2e1566de34fe8ace52fc6dcdd60"}
Apr 17 07:54:09.504721 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:09.502593 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp" event={"ID":"ed049ead-6ee3-4a75-945e-6168dd530b2c","Type":"ContainerStarted","Data":"e57091b8b2c9219a5926c2a305965395ef365341778ce5f7a9d198f93506d002"}
Apr 17 07:54:09.504721 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:09.504170 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sm78q" event={"ID":"fcbda289-b762-45ea-ba60-5188e612db63","Type":"ContainerStarted","Data":"b676b07e661a731756fccf3301ee7214a719e90bf6a88579b541b907105e99af"}
Apr 17 07:54:09.506051 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:09.505974 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87","Type":"ContainerStarted","Data":"dcbab5f3bcb0b985d9f56520d7cd133af453c426d51cf0da4a39a9816d8ca41a"}
Apr 17 07:54:09.577470 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:09.575231 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sm78q" podStartSLOduration=131.077178747 podStartE2EDuration="2m15.575211307s" podCreationTimestamp="2026-04-17 07:51:54 +0000 UTC" firstStartedPulling="2026-04-17 07:54:04.761105064 +0000 UTC m=+163.564055782" lastFinishedPulling="2026-04-17 07:54:09.259137611 +0000 UTC m=+168.062088342" observedRunningTime="2026-04-17 07:54:09.573234681 +0000 UTC m=+168.376185426" watchObservedRunningTime="2026-04-17 07:54:09.575211307 +0000 UTC m=+168.378162049"
Apr 17 07:54:10.512726 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.512680 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp" event={"ID":"ed049ead-6ee3-4a75-945e-6168dd530b2c","Type":"ContainerStarted","Data":"f6b217248308be62038ec06181b27d3072b86f953df21261e26f5b6944227f5f"}
Apr 17 07:54:10.512726 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.512730 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp" event={"ID":"ed049ead-6ee3-4a75-945e-6168dd530b2c","Type":"ContainerStarted","Data":"cf01d559801607b5a719b81b754a4fbd6b3ea3de81b374e01e2ca352e909966a"}
Apr 17 07:54:10.513243 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.513009 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:54:10.516066 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.516024 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87","Type":"ContainerStarted","Data":"0236b8281dbe856611b32022eaffd9516c0e5753ea21003fd2cb88097866bdf0"}
Apr 17 07:54:10.516066 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.516059 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87","Type":"ContainerStarted","Data":"de3181808a63b819e2dd59154567d52811ba4247be09da61a4664675918188eb"}
Apr 17 07:54:10.516225 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.516074 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87","Type":"ContainerStarted","Data":"52718399b142c278741f87a5903390767ac188f5a07512377764eb26de5cfc1c"}
Apr 17 07:54:10.516225 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.516087 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87","Type":"ContainerStarted","Data":"a6b0d28a57dc6992d489623df27bc0949891b48a910c2fbf8d271f1b372c458b"}
Apr 17 07:54:10.516225 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.516099 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d4a32f9-181c-4398-82b0-3b1cf0ab3e87","Type":"ContainerStarted","Data":"e17ea794118357d1250f6b660716a09cb997bba341e824f0a8010e7102f30b11"}
Apr 17 07:54:10.518269 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.518241 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mspd5" event={"ID":"5e3cf222-71f9-4a25-88bb-37c528ac2994","Type":"ContainerStarted","Data":"c7a10c10431bf87118027993ec803dcf4f1d1524aec27abb84401fbdc23af750"}
Apr 17 07:54:10.519643 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.519620 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp"
Apr 17 07:54:10.543062 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.541584 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-547b55c77c-mkwgp" podStartSLOduration=5.011934028 podStartE2EDuration="13.541570388s" podCreationTimestamp="2026-04-17 07:53:57 +0000 UTC" firstStartedPulling="2026-04-17 07:54:00.722567717 +0000 UTC m=+159.525518439" lastFinishedPulling="2026-04-17 07:54:09.252204063 +0000 UTC m=+168.055154799" observedRunningTime="2026-04-17 07:54:10.539952033 +0000 UTC m=+169.342902787" watchObservedRunningTime="2026-04-17 07:54:10.541570388 +0000 UTC m=+169.344521128"
Apr 17 07:54:10.571509 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.571438 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=6.799509414 podStartE2EDuration="15.571422821s" podCreationTimestamp="2026-04-17 07:53:55 +0000 UTC" firstStartedPulling="2026-04-17 07:54:00.482902683 +0000 UTC m=+159.285853406" lastFinishedPulling="2026-04-17 07:54:09.254816094 +0000 UTC m=+168.057766813" observedRunningTime="2026-04-17 07:54:10.569697162 +0000 UTC m=+169.372647916" watchObservedRunningTime="2026-04-17 07:54:10.571422821 +0000 UTC m=+169.374373562"
Apr 17 07:54:10.593234 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.593173 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mspd5" podStartSLOduration=132.328912172 podStartE2EDuration="2m16.593150514s" podCreationTimestamp="2026-04-17 07:51:54 +0000 UTC" firstStartedPulling="2026-04-17 07:54:04.959700253 +0000 UTC m=+163.762650976" lastFinishedPulling="2026-04-17 07:54:09.223938596 +0000 UTC m=+168.026889318" observedRunningTime="2026-04-17 07:54:10.591354232 +0000 UTC m=+169.394304982" watchObservedRunningTime="2026-04-17 07:54:10.593150514 +0000 UTC m=+169.396101255"
Apr 17 07:54:10.853330 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.853223 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-799c8767ff-pld6g"
Apr 17 07:54:10.853481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.853336 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-799c8767ff-pld6g"
Apr 17 07:54:10.854690 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.854663 2567 patch_prober.go:28] interesting pod/console-799c8767ff-pld6g container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.25:8443/health\": dial tcp 10.134.0.25:8443: connect: connection refused" start-of-body=
Apr 17 07:54:10.854826 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:10.854706 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-799c8767ff-pld6g" podUID="3f756eab-34b4-41f3-abc6-e71fe10ea19d" containerName="console" probeResult="failure" output="Get \"https://10.134.0.25:8443/health\": dial tcp 10.134.0.25:8443: connect: connection refused"
Apr 17 07:54:11.522607 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:11.522574 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mspd5"
Apr 17 07:54:14.689816 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:14.689774 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:14.689816 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:14.689824 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:14.691141 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:14.691121 2567 patch_prober.go:28] interesting pod/console-557d45fc88-bpgvk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.26:8443/health\": dial tcp 10.134.0.26:8443: connect: connection refused" start-of-body=
Apr 17 07:54:14.691199 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:14.691162 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-557d45fc88-bpgvk" podUID="1ff6c792-cd18-42a1-ba8b-7f472632c7df" containerName="console" probeResult="failure" output="Get \"https://10.134.0.26:8443/health\": dial tcp 10.134.0.26:8443: connect: connection refused"
Apr 17 07:54:19.489607 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:19.489570 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:54:19.490054 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:19.489619 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7"
Apr 17 07:54:20.853764 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:20.853722 2567 patch_prober.go:28] interesting pod/console-799c8767ff-pld6g container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.25:8443/health\": dial tcp 10.134.0.25:8443: connect: connection refused" start-of-body=
Apr 17 07:54:20.854147 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:20.853797 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-799c8767ff-pld6g" podUID="3f756eab-34b4-41f3-abc6-e71fe10ea19d" containerName="console" probeResult="failure" output="Get \"https://10.134.0.25:8443/health\": dial tcp 10.134.0.25:8443: connect: connection refused"
Apr 17 07:54:21.528871 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:21.528840 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mspd5"
Apr 17 07:54:24.693794 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:24.693759 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:24.697729 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:24.697710 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-557d45fc88-bpgvk"
Apr 17 07:54:24.761398 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:24.761367 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-799c8767ff-pld6g"]
Apr 17 07:54:25.527999 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.527914 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b8db6fcbc-qdtzr" podUID="4cb4407b-fac2-416d-abdd-9452c01b5c16" containerName="console" containerID="cri-o://16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646" gracePeriod=15
Apr 17 07:54:25.778892 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.778838 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b8db6fcbc-qdtzr_4cb4407b-fac2-416d-abdd-9452c01b5c16/console/0.log"
Apr 17 07:54:25.779176 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.778910 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:54:25.878996 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.878961 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-oauth-config\") pod \"4cb4407b-fac2-416d-abdd-9452c01b5c16\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") "
Apr 17 07:54:25.878996 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.879006 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmwfk\" (UniqueName: \"kubernetes.io/projected/4cb4407b-fac2-416d-abdd-9452c01b5c16-kube-api-access-qmwfk\") pod \"4cb4407b-fac2-416d-abdd-9452c01b5c16\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") "
Apr 17 07:54:25.879220 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.879039 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-serving-cert\") pod \"4cb4407b-fac2-416d-abdd-9452c01b5c16\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") "
Apr 17 07:54:25.879220 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.879090 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-oauth-serving-cert\") pod \"4cb4407b-fac2-416d-abdd-9452c01b5c16\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") "
Apr 17 07:54:25.879220 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.879125 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-service-ca\") pod \"4cb4407b-fac2-416d-abdd-9452c01b5c16\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") "
Apr 17 07:54:25.879220 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.879203 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-config\") pod \"4cb4407b-fac2-416d-abdd-9452c01b5c16\" (UID: \"4cb4407b-fac2-416d-abdd-9452c01b5c16\") "
Apr 17 07:54:25.879602 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.879576 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4cb4407b-fac2-416d-abdd-9452c01b5c16" (UID: "4cb4407b-fac2-416d-abdd-9452c01b5c16"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:54:25.879680 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.879580 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-service-ca" (OuterVolumeSpecName: "service-ca") pod "4cb4407b-fac2-416d-abdd-9452c01b5c16" (UID: "4cb4407b-fac2-416d-abdd-9452c01b5c16"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:54:25.879680 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.879608 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-config" (OuterVolumeSpecName: "console-config") pod "4cb4407b-fac2-416d-abdd-9452c01b5c16" (UID: "4cb4407b-fac2-416d-abdd-9452c01b5c16"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:54:25.881554 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.881528 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4cb4407b-fac2-416d-abdd-9452c01b5c16" (UID: "4cb4407b-fac2-416d-abdd-9452c01b5c16"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:54:25.881851 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.881825 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb4407b-fac2-416d-abdd-9452c01b5c16-kube-api-access-qmwfk" (OuterVolumeSpecName: "kube-api-access-qmwfk") pod "4cb4407b-fac2-416d-abdd-9452c01b5c16" (UID: "4cb4407b-fac2-416d-abdd-9452c01b5c16"). InnerVolumeSpecName "kube-api-access-qmwfk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:54:25.881851 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.881826 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4cb4407b-fac2-416d-abdd-9452c01b5c16" (UID: "4cb4407b-fac2-416d-abdd-9452c01b5c16"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:54:25.980738 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.980701 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\""
Apr 17 07:54:25.980913 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.980808 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-oauth-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\""
Apr 17 07:54:25.980913 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.980822 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-service-ca\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\""
Apr 17 07:54:25.980913 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.980835 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\""
Apr 17 07:54:25.980913 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.980844 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4cb4407b-fac2-416d-abdd-9452c01b5c16-console-oauth-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\""
Apr 17 07:54:25.980913 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:25.980853 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmwfk\" (UniqueName: \"kubernetes.io/projected/4cb4407b-fac2-416d-abdd-9452c01b5c16-kube-api-access-qmwfk\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\""
Apr 17 07:54:26.576614 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.576587 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b8db6fcbc-qdtzr_4cb4407b-fac2-416d-abdd-9452c01b5c16/console/0.log"
Apr 17 07:54:26.576792 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.576628 2567 generic.go:358] "Generic (PLEG): container finished" podID="4cb4407b-fac2-416d-abdd-9452c01b5c16" containerID="16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646" exitCode=2
Apr 17 07:54:26.576792 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.576700 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8db6fcbc-qdtzr"
Apr 17 07:54:26.576792 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.576710 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8db6fcbc-qdtzr" event={"ID":"4cb4407b-fac2-416d-abdd-9452c01b5c16","Type":"ContainerDied","Data":"16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646"}
Apr 17 07:54:26.576792 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.576737 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8db6fcbc-qdtzr" event={"ID":"4cb4407b-fac2-416d-abdd-9452c01b5c16","Type":"ContainerDied","Data":"259906c7ca68fa2da806b3bd614c6bb29dd74401236d8ac5aea17c0d0c6e17f2"}
Apr 17 07:54:26.576792 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.576752 2567 scope.go:117] "RemoveContainer" containerID="16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646"
Apr 17 07:54:26.586311 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.586270 2567 scope.go:117] "RemoveContainer" containerID="16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646"
Apr 17 07:54:26.586592 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:54:26.586567 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646\": container with ID starting with 16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646 not found: ID does not exist" containerID="16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646"
Apr 17 07:54:26.586686 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.586602 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646"} err="failed to get container status \"16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646\": rpc error: code = NotFound desc = could not find container \"16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646\": container with ID starting with 16510db1f9479a9392642b3c466cc5ecbed4dbffa065e652551563374926f646 not found: ID does not exist"
Apr 17 07:54:26.598430 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.598404 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b8db6fcbc-qdtzr"]
Apr 17 07:54:26.602672 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:26.602650 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b8db6fcbc-qdtzr"]
Apr 17 07:54:27.785647 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:27.785614 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb4407b-fac2-416d-abdd-9452c01b5c16" path="/var/lib/kubelet/pods/4cb4407b-fac2-416d-abdd-9452c01b5c16/volumes"
Apr 17 07:54:29.328905 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.328860 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-765d9fcf8-7tpsf" podUID="1607537c-a1af-44e7-8796-bfced16b02f3" containerName="console" containerID="cri-o://abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5" gracePeriod=15
Apr 17 07:54:29.564891 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.564870 2567 log.go:25]
"Finished parsing log file" path="/var/log/pods/openshift-console_console-765d9fcf8-7tpsf_1607537c-a1af-44e7-8796-bfced16b02f3/console/0.log" Apr 17 07:54:29.565013 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.564930 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:54:29.588669 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.588600 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-765d9fcf8-7tpsf_1607537c-a1af-44e7-8796-bfced16b02f3/console/0.log" Apr 17 07:54:29.588669 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.588642 2567 generic.go:358] "Generic (PLEG): container finished" podID="1607537c-a1af-44e7-8796-bfced16b02f3" containerID="abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5" exitCode=2 Apr 17 07:54:29.588840 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.588714 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-765d9fcf8-7tpsf" Apr 17 07:54:29.588840 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.588730 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-765d9fcf8-7tpsf" event={"ID":"1607537c-a1af-44e7-8796-bfced16b02f3","Type":"ContainerDied","Data":"abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5"} Apr 17 07:54:29.588840 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.588765 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-765d9fcf8-7tpsf" event={"ID":"1607537c-a1af-44e7-8796-bfced16b02f3","Type":"ContainerDied","Data":"b736e9af32de4a9e36c6d235778ede254b0e05e39ee1f7f74e1805b66ed4e801"} Apr 17 07:54:29.588840 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.588779 2567 scope.go:117] "RemoveContainer" containerID="abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5" Apr 17 07:54:29.597910 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.597885 2567 scope.go:117] "RemoveContainer" containerID="abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5" Apr 17 07:54:29.598170 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:54:29.598150 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5\": container with ID starting with abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5 not found: ID does not exist" containerID="abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5" Apr 17 07:54:29.598247 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.598184 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5"} err="failed to get container status \"abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5\": rpc error: code = NotFound 
desc = could not find container \"abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5\": container with ID starting with abbf6ab3316770995f80855bf2cfd8445d58a79ce8caf0c1e90b2fa5dcf6eff5 not found: ID does not exist" Apr 17 07:54:29.609628 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.609601 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-console-config\") pod \"1607537c-a1af-44e7-8796-bfced16b02f3\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " Apr 17 07:54:29.609769 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.609654 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-oauth-config\") pod \"1607537c-a1af-44e7-8796-bfced16b02f3\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " Apr 17 07:54:29.609769 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.609694 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-oauth-serving-cert\") pod \"1607537c-a1af-44e7-8796-bfced16b02f3\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " Apr 17 07:54:29.609769 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.609751 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-trusted-ca-bundle\") pod \"1607537c-a1af-44e7-8796-bfced16b02f3\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " Apr 17 07:54:29.609922 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.609802 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b64ck\" (UniqueName: 
\"kubernetes.io/projected/1607537c-a1af-44e7-8796-bfced16b02f3-kube-api-access-b64ck\") pod \"1607537c-a1af-44e7-8796-bfced16b02f3\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " Apr 17 07:54:29.609922 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.609834 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-serving-cert\") pod \"1607537c-a1af-44e7-8796-bfced16b02f3\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " Apr 17 07:54:29.609922 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.609878 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-service-ca\") pod \"1607537c-a1af-44e7-8796-bfced16b02f3\" (UID: \"1607537c-a1af-44e7-8796-bfced16b02f3\") " Apr 17 07:54:29.610074 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.609955 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-console-config" (OuterVolumeSpecName: "console-config") pod "1607537c-a1af-44e7-8796-bfced16b02f3" (UID: "1607537c-a1af-44e7-8796-bfced16b02f3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:29.610156 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.610129 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1607537c-a1af-44e7-8796-bfced16b02f3" (UID: "1607537c-a1af-44e7-8796-bfced16b02f3"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:29.610217 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.610153 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-console-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:29.610217 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.610171 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1607537c-a1af-44e7-8796-bfced16b02f3" (UID: "1607537c-a1af-44e7-8796-bfced16b02f3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:29.610404 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.610385 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-service-ca" (OuterVolumeSpecName: "service-ca") pod "1607537c-a1af-44e7-8796-bfced16b02f3" (UID: "1607537c-a1af-44e7-8796-bfced16b02f3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:29.612029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.611998 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1607537c-a1af-44e7-8796-bfced16b02f3-kube-api-access-b64ck" (OuterVolumeSpecName: "kube-api-access-b64ck") pod "1607537c-a1af-44e7-8796-bfced16b02f3" (UID: "1607537c-a1af-44e7-8796-bfced16b02f3"). InnerVolumeSpecName "kube-api-access-b64ck". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:54:29.612029 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.612010 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1607537c-a1af-44e7-8796-bfced16b02f3" (UID: "1607537c-a1af-44e7-8796-bfced16b02f3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:54:29.612259 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.612234 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1607537c-a1af-44e7-8796-bfced16b02f3" (UID: "1607537c-a1af-44e7-8796-bfced16b02f3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:54:29.711020 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.710979 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-oauth-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:29.711020 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.711015 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-trusted-ca-bundle\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:29.711020 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.711028 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b64ck\" (UniqueName: \"kubernetes.io/projected/1607537c-a1af-44e7-8796-bfced16b02f3-kube-api-access-b64ck\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:29.711253 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:54:29.711040 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:29.711253 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.711052 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1607537c-a1af-44e7-8796-bfced16b02f3-service-ca\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:29.711253 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.711065 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1607537c-a1af-44e7-8796-bfced16b02f3-console-oauth-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:29.904013 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.903930 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-765d9fcf8-7tpsf"] Apr 17 07:54:29.907638 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:29.907614 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-765d9fcf8-7tpsf"] Apr 17 07:54:31.787744 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:31.787708 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1607537c-a1af-44e7-8796-bfced16b02f3" path="/var/lib/kubelet/pods/1607537c-a1af-44e7-8796-bfced16b02f3/volumes" Apr 17 07:54:39.495440 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:39.495408 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" Apr 17 07:54:39.499123 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:39.499106 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7c648db48d-fm8f7" Apr 17 07:54:40.629846 
ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:40.629810 2567 generic.go:358] "Generic (PLEG): container finished" podID="f9503e60-cd11-4c96-a718-f33e86501791" containerID="54e754585b57eb56dcedbf16307fb400718a255e9bc1ee24160b115e2efb329c" exitCode=0 Apr 17 07:54:40.630273 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:40.629858 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" event={"ID":"f9503e60-cd11-4c96-a718-f33e86501791","Type":"ContainerDied","Data":"54e754585b57eb56dcedbf16307fb400718a255e9bc1ee24160b115e2efb329c"} Apr 17 07:54:40.630273 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:40.630154 2567 scope.go:117] "RemoveContainer" containerID="54e754585b57eb56dcedbf16307fb400718a255e9bc1ee24160b115e2efb329c" Apr 17 07:54:41.634973 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:41.634938 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fwwvt" event={"ID":"f9503e60-cd11-4c96-a718-f33e86501791","Type":"ContainerStarted","Data":"500ca1b35afb60699171c90c5bab153d0529e1721bf804a78763889f94822bc9"} Apr 17 07:54:49.781745 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:49.781687 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-799c8767ff-pld6g" podUID="3f756eab-34b4-41f3-abc6-e71fe10ea19d" containerName="console" containerID="cri-o://a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1" gracePeriod=15 Apr 17 07:54:50.051904 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.051882 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-799c8767ff-pld6g_3f756eab-34b4-41f3-abc6-e71fe10ea19d/console/0.log" Apr 17 07:54:50.052017 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.051980 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:50.097119 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097088 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-oauth-serving-cert\") pod \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " Apr 17 07:54:50.097274 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097137 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-oauth-config\") pod \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " Apr 17 07:54:50.097274 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097165 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-service-ca\") pod \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " Apr 17 07:54:50.097274 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097191 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-serving-cert\") pod \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " Apr 17 07:54:50.097274 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097234 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-config\") pod \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " Apr 17 07:54:50.097274 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:54:50.097255 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-trusted-ca-bundle\") pod \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " Apr 17 07:54:50.097544 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097348 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hzhn\" (UniqueName: \"kubernetes.io/projected/3f756eab-34b4-41f3-abc6-e71fe10ea19d-kube-api-access-6hzhn\") pod \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\" (UID: \"3f756eab-34b4-41f3-abc6-e71fe10ea19d\") " Apr 17 07:54:50.097544 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097460 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3f756eab-34b4-41f3-abc6-e71fe10ea19d" (UID: "3f756eab-34b4-41f3-abc6-e71fe10ea19d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:50.097644 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097633 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-oauth-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:50.097807 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097780 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-config" (OuterVolumeSpecName: "console-config") pod "3f756eab-34b4-41f3-abc6-e71fe10ea19d" (UID: "3f756eab-34b4-41f3-abc6-e71fe10ea19d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:50.097873 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097821 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3f756eab-34b4-41f3-abc6-e71fe10ea19d" (UID: "3f756eab-34b4-41f3-abc6-e71fe10ea19d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:50.097933 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.097915 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-service-ca" (OuterVolumeSpecName: "service-ca") pod "3f756eab-34b4-41f3-abc6-e71fe10ea19d" (UID: "3f756eab-34b4-41f3-abc6-e71fe10ea19d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:50.099633 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.099598 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3f756eab-34b4-41f3-abc6-e71fe10ea19d" (UID: "3f756eab-34b4-41f3-abc6-e71fe10ea19d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:54:50.099737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.099646 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3f756eab-34b4-41f3-abc6-e71fe10ea19d" (UID: "3f756eab-34b4-41f3-abc6-e71fe10ea19d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:54:50.099737 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.099676 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f756eab-34b4-41f3-abc6-e71fe10ea19d-kube-api-access-6hzhn" (OuterVolumeSpecName: "kube-api-access-6hzhn") pod "3f756eab-34b4-41f3-abc6-e71fe10ea19d" (UID: "3f756eab-34b4-41f3-abc6-e71fe10ea19d"). InnerVolumeSpecName "kube-api-access-6hzhn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:54:50.198557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.198521 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-oauth-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:50.198557 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.198555 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-service-ca\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:50.198752 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.198568 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:50.198752 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.198581 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-console-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:50.198752 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.198593 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3f756eab-34b4-41f3-abc6-e71fe10ea19d-trusted-ca-bundle\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:50.198752 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.198605 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hzhn\" (UniqueName: \"kubernetes.io/projected/3f756eab-34b4-41f3-abc6-e71fe10ea19d-kube-api-access-6hzhn\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:54:50.669909 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.669880 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-799c8767ff-pld6g_3f756eab-34b4-41f3-abc6-e71fe10ea19d/console/0.log" Apr 17 07:54:50.670084 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.669924 2567 generic.go:358] "Generic (PLEG): container finished" podID="3f756eab-34b4-41f3-abc6-e71fe10ea19d" containerID="a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1" exitCode=2 Apr 17 07:54:50.670084 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.669958 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799c8767ff-pld6g" event={"ID":"3f756eab-34b4-41f3-abc6-e71fe10ea19d","Type":"ContainerDied","Data":"a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1"} Apr 17 07:54:50.670084 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.669993 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799c8767ff-pld6g" Apr 17 07:54:50.670084 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.670000 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799c8767ff-pld6g" event={"ID":"3f756eab-34b4-41f3-abc6-e71fe10ea19d","Type":"ContainerDied","Data":"4c5506c4d6b4f8d4df939eb376344bfd567ac27e961f99a1f397fca3349b9317"} Apr 17 07:54:50.670084 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.670016 2567 scope.go:117] "RemoveContainer" containerID="a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1" Apr 17 07:54:50.678723 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.678705 2567 scope.go:117] "RemoveContainer" containerID="a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1" Apr 17 07:54:50.679089 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:54:50.679067 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1\": container with ID starting with a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1 not found: ID does not exist" containerID="a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1" Apr 17 07:54:50.679153 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.679099 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1"} err="failed to get container status \"a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1\": rpc error: code = NotFound desc = could not find container \"a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1\": container with ID starting with a30084237942088ac76fc88dc302f01c762cca9d8f60b5a9ce70513bd42238c1 not found: ID does not exist" Apr 17 07:54:50.698350 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.698328 2567 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-799c8767ff-pld6g"] Apr 17 07:54:50.701621 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:50.701595 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-799c8767ff-pld6g"] Apr 17 07:54:51.783516 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:54:51.783483 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f756eab-34b4-41f3-abc6-e71fe10ea19d" path="/var/lib/kubelet/pods/3f756eab-34b4-41f3-abc6-e71fe10ea19d/volumes" Apr 17 07:55:33.460985 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.460946 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-db59969c6-c57gh"] Apr 17 07:55:33.461464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.461331 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1607537c-a1af-44e7-8796-bfced16b02f3" containerName="console" Apr 17 07:55:33.461464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.461343 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1607537c-a1af-44e7-8796-bfced16b02f3" containerName="console" Apr 17 07:55:33.461464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.461354 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f756eab-34b4-41f3-abc6-e71fe10ea19d" containerName="console" Apr 17 07:55:33.461464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.461360 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f756eab-34b4-41f3-abc6-e71fe10ea19d" containerName="console" Apr 17 07:55:33.461464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.461376 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cb4407b-fac2-416d-abdd-9452c01b5c16" containerName="console" Apr 17 07:55:33.461464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.461382 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb4407b-fac2-416d-abdd-9452c01b5c16" 
containerName="console" Apr 17 07:55:33.461464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.461428 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="1607537c-a1af-44e7-8796-bfced16b02f3" containerName="console" Apr 17 07:55:33.461464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.461436 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f756eab-34b4-41f3-abc6-e71fe10ea19d" containerName="console" Apr 17 07:55:33.461464 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.461445 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cb4407b-fac2-416d-abdd-9452c01b5c16" containerName="console" Apr 17 07:55:33.464495 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.464473 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.471001 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.470865 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-oauth-config\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.471001 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.470929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-service-ca\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.471551 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.471037 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-trusted-ca-bundle\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.471551 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.471075 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9spf\" (UniqueName: \"kubernetes.io/projected/6a8364f9-20d1-4c3d-bd17-94f7f0041431-kube-api-access-s9spf\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.471551 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.471106 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-serving-cert\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.471551 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.471133 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-oauth-serving-cert\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.471551 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.471173 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-config\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.472949 ip-10-0-130-28 kubenswrapper[2567]: 
I0417 07:55:33.472925 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-db59969c6-c57gh"] Apr 17 07:55:33.571913 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.571881 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-trusted-ca-bundle\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.571913 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.571917 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9spf\" (UniqueName: \"kubernetes.io/projected/6a8364f9-20d1-4c3d-bd17-94f7f0041431-kube-api-access-s9spf\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.572144 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.571938 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-serving-cert\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.572144 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.571967 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-oauth-serving-cert\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.572144 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.572001 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-config\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.572144 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.572047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-oauth-config\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.572144 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.572093 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-service-ca\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.572920 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.572863 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-oauth-serving-cert\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.572920 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.572904 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-config\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.573121 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.573103 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-trusted-ca-bundle\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.573199 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.573176 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-service-ca\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.574654 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.574634 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-oauth-config\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.574883 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.574863 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-serving-cert\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.582273 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.582249 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9spf\" (UniqueName: \"kubernetes.io/projected/6a8364f9-20d1-4c3d-bd17-94f7f0041431-kube-api-access-s9spf\") pod \"console-db59969c6-c57gh\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.774444 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.774406 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:33.898175 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:33.898152 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-db59969c6-c57gh"] Apr 17 07:55:33.900828 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:55:33.900804 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a8364f9_20d1_4c3d_bd17_94f7f0041431.slice/crio-5dd27097bde8b1cc1d26c31a1f9826817d974a72f5464a558d8847d44218d1be WatchSource:0}: Error finding container 5dd27097bde8b1cc1d26c31a1f9826817d974a72f5464a558d8847d44218d1be: Status 404 returned error can't find the container with id 5dd27097bde8b1cc1d26c31a1f9826817d974a72f5464a558d8847d44218d1be Apr 17 07:55:34.821432 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:34.821390 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-db59969c6-c57gh" event={"ID":"6a8364f9-20d1-4c3d-bd17-94f7f0041431","Type":"ContainerStarted","Data":"d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce"} Apr 17 07:55:34.821432 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:34.821438 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-db59969c6-c57gh" event={"ID":"6a8364f9-20d1-4c3d-bd17-94f7f0041431","Type":"ContainerStarted","Data":"5dd27097bde8b1cc1d26c31a1f9826817d974a72f5464a558d8847d44218d1be"} Apr 17 07:55:34.837984 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:34.837931 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-db59969c6-c57gh" podStartSLOduration=1.8379121870000001 podStartE2EDuration="1.837912187s" podCreationTimestamp="2026-04-17 07:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:55:34.836818644 +0000 UTC m=+253.639769410" 
watchObservedRunningTime="2026-04-17 07:55:34.837912187 +0000 UTC m=+253.640862928" Apr 17 07:55:42.787767 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.787733 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-m6m2c"] Apr 17 07:55:42.791260 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.791234 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:42.793485 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.793461 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 07:55:42.798780 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.798748 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m6m2c"] Apr 17 07:55:42.853340 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.853280 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6bddba28-b3c8-46bd-bb8e-166c73a7acbe-dbus\") pod \"global-pull-secret-syncer-m6m2c\" (UID: \"6bddba28-b3c8-46bd-bb8e-166c73a7acbe\") " pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:42.853508 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.853380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6bddba28-b3c8-46bd-bb8e-166c73a7acbe-kubelet-config\") pod \"global-pull-secret-syncer-m6m2c\" (UID: \"6bddba28-b3c8-46bd-bb8e-166c73a7acbe\") " pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:42.853508 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.853433 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/6bddba28-b3c8-46bd-bb8e-166c73a7acbe-original-pull-secret\") pod \"global-pull-secret-syncer-m6m2c\" (UID: \"6bddba28-b3c8-46bd-bb8e-166c73a7acbe\") " pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:42.954496 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.954457 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6bddba28-b3c8-46bd-bb8e-166c73a7acbe-dbus\") pod \"global-pull-secret-syncer-m6m2c\" (UID: \"6bddba28-b3c8-46bd-bb8e-166c73a7acbe\") " pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:42.954675 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.954517 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6bddba28-b3c8-46bd-bb8e-166c73a7acbe-kubelet-config\") pod \"global-pull-secret-syncer-m6m2c\" (UID: \"6bddba28-b3c8-46bd-bb8e-166c73a7acbe\") " pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:42.954675 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.954557 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6bddba28-b3c8-46bd-bb8e-166c73a7acbe-original-pull-secret\") pod \"global-pull-secret-syncer-m6m2c\" (UID: \"6bddba28-b3c8-46bd-bb8e-166c73a7acbe\") " pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:42.954675 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.954655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6bddba28-b3c8-46bd-bb8e-166c73a7acbe-kubelet-config\") pod \"global-pull-secret-syncer-m6m2c\" (UID: \"6bddba28-b3c8-46bd-bb8e-166c73a7acbe\") " pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:42.954843 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.954678 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6bddba28-b3c8-46bd-bb8e-166c73a7acbe-dbus\") pod \"global-pull-secret-syncer-m6m2c\" (UID: \"6bddba28-b3c8-46bd-bb8e-166c73a7acbe\") " pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:42.956899 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:42.956878 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6bddba28-b3c8-46bd-bb8e-166c73a7acbe-original-pull-secret\") pod \"global-pull-secret-syncer-m6m2c\" (UID: \"6bddba28-b3c8-46bd-bb8e-166c73a7acbe\") " pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:43.102866 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:43.102775 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m6m2c" Apr 17 07:55:43.225427 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:43.225393 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m6m2c"] Apr 17 07:55:43.228824 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:55:43.228795 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bddba28_b3c8_46bd_bb8e_166c73a7acbe.slice/crio-0af9bf8e49cd8dfb412416fbb6c615dbbde265f1dbce6513396849cde9790fb6 WatchSource:0}: Error finding container 0af9bf8e49cd8dfb412416fbb6c615dbbde265f1dbce6513396849cde9790fb6: Status 404 returned error can't find the container with id 0af9bf8e49cd8dfb412416fbb6c615dbbde265f1dbce6513396849cde9790fb6 Apr 17 07:55:43.775338 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:43.775217 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:43.775338 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:43.775280 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:43.784785 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:43.784761 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:43.852708 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:43.852672 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m6m2c" event={"ID":"6bddba28-b3c8-46bd-bb8e-166c73a7acbe","Type":"ContainerStarted","Data":"0af9bf8e49cd8dfb412416fbb6c615dbbde265f1dbce6513396849cde9790fb6"} Apr 17 07:55:43.857323 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:43.857280 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-db59969c6-c57gh" Apr 17 07:55:43.903306 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:43.903248 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-557d45fc88-bpgvk"] Apr 17 07:55:47.866404 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:47.866363 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m6m2c" event={"ID":"6bddba28-b3c8-46bd-bb8e-166c73a7acbe","Type":"ContainerStarted","Data":"f8aead0fd4d362cfaf76d24d39481e672ee14b0886e398cfe60e61ecc5a62f9e"} Apr 17 07:55:47.881357 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:55:47.881278 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m6m2c" podStartSLOduration=2.333109635 podStartE2EDuration="5.881261523s" podCreationTimestamp="2026-04-17 07:55:42 +0000 UTC" firstStartedPulling="2026-04-17 07:55:43.230422658 +0000 UTC m=+262.033373380" lastFinishedPulling="2026-04-17 07:55:46.778574548 +0000 UTC m=+265.581525268" observedRunningTime="2026-04-17 07:55:47.880074629 +0000 UTC m=+266.683025370" watchObservedRunningTime="2026-04-17 07:55:47.881261523 +0000 UTC m=+266.684212265" Apr 17 07:56:02.102425 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:56:02.102387 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm"] Apr 17 07:56:02.106257 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.106236 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.108865 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.108843 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 07:56:02.109776 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.109760 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 07:56:02.109776 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.109771 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mhll7\"" Apr 17 07:56:02.112994 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.112970 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm"] Apr 17 07:56:02.235184 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.235143 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.235396 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.235275 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xdznd\" (UniqueName: \"kubernetes.io/projected/6b247e82-9351-4d26-8cdd-827c57479cbf-kube-api-access-xdznd\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.235396 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.235328 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.336074 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.336034 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdznd\" (UniqueName: \"kubernetes.io/projected/6b247e82-9351-4d26-8cdd-827c57479cbf-kube-api-access-xdznd\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.336074 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.336077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.336221 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.336107 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.336538 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.336516 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.336575 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.336537 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.344794 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.344771 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdznd\" (UniqueName: \"kubernetes.io/projected/6b247e82-9351-4d26-8cdd-827c57479cbf-kube-api-access-xdznd\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.417043 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.416943 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:02.560358 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.560326 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm"] Apr 17 07:56:02.562914 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:56:02.562877 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b247e82_9351_4d26_8cdd_827c57479cbf.slice/crio-6b48beab5826b3e436cb0cc141801ea933ec6eee6ea68cbfd1e04e7aec416d4b WatchSource:0}: Error finding container 6b48beab5826b3e436cb0cc141801ea933ec6eee6ea68cbfd1e04e7aec416d4b: Status 404 returned error can't find the container with id 6b48beab5826b3e436cb0cc141801ea933ec6eee6ea68cbfd1e04e7aec416d4b Apr 17 07:56:02.917275 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:02.917240 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" event={"ID":"6b247e82-9351-4d26-8cdd-827c57479cbf","Type":"ContainerStarted","Data":"6b48beab5826b3e436cb0cc141801ea933ec6eee6ea68cbfd1e04e7aec416d4b"} Apr 17 07:56:08.928152 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:08.928091 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-557d45fc88-bpgvk" podUID="1ff6c792-cd18-42a1-ba8b-7f472632c7df" containerName="console" containerID="cri-o://52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6" gracePeriod=15 Apr 17 07:56:09.876908 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.876887 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-557d45fc88-bpgvk_1ff6c792-cd18-42a1-ba8b-7f472632c7df/console/0.log" Apr 17 07:56:09.877014 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.876950 2567 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557d45fc88-bpgvk" Apr 17 07:56:09.949542 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.949501 2567 generic.go:358] "Generic (PLEG): container finished" podID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerID="c41f71175358368aa378026e3559c313571ba17a4a3f93eeb6db9d091a120071" exitCode=0 Apr 17 07:56:09.949951 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.949580 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" event={"ID":"6b247e82-9351-4d26-8cdd-827c57479cbf","Type":"ContainerDied","Data":"c41f71175358368aa378026e3559c313571ba17a4a3f93eeb6db9d091a120071"} Apr 17 07:56:09.950928 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.950913 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-557d45fc88-bpgvk_1ff6c792-cd18-42a1-ba8b-7f472632c7df/console/0.log" Apr 17 07:56:09.950995 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.950949 2567 generic.go:358] "Generic (PLEG): container finished" podID="1ff6c792-cd18-42a1-ba8b-7f472632c7df" containerID="52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6" exitCode=2 Apr 17 07:56:09.950995 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.950983 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557d45fc88-bpgvk" event={"ID":"1ff6c792-cd18-42a1-ba8b-7f472632c7df","Type":"ContainerDied","Data":"52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6"} Apr 17 07:56:09.951123 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.951005 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557d45fc88-bpgvk" event={"ID":"1ff6c792-cd18-42a1-ba8b-7f472632c7df","Type":"ContainerDied","Data":"96fc0896882f625c512d4ed117072fe5b974341933eab56bc63feec3d5038f37"} Apr 17 07:56:09.951123 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:56:09.951015 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557d45fc88-bpgvk" Apr 17 07:56:09.951123 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.951026 2567 scope.go:117] "RemoveContainer" containerID="52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6" Apr 17 07:56:09.965747 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.965723 2567 scope.go:117] "RemoveContainer" containerID="52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6" Apr 17 07:56:09.966065 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:09.966044 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6\": container with ID starting with 52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6 not found: ID does not exist" containerID="52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6" Apr 17 07:56:09.966139 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:09.966078 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6"} err="failed to get container status \"52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6\": rpc error: code = NotFound desc = could not find container \"52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6\": container with ID starting with 52026cc2a6f1e940e419f1b631314fd51cb2fdfc1be960af9dac1fb1683f54a6 not found: ID does not exist" Apr 17 07:56:10.005947 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.005916 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-oauth-config\") pod \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\" (UID: 
\"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " Apr 17 07:56:10.006109 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.005969 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-oauth-serving-cert\") pod \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " Apr 17 07:56:10.006109 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.006023 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-serving-cert\") pod \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " Apr 17 07:56:10.006109 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.006044 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-service-ca\") pod \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " Apr 17 07:56:10.006109 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.006069 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cm2r\" (UniqueName: \"kubernetes.io/projected/1ff6c792-cd18-42a1-ba8b-7f472632c7df-kube-api-access-2cm2r\") pod \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " Apr 17 07:56:10.006324 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.006118 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-config\") pod \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " Apr 17 07:56:10.006324 ip-10-0-130-28 kubenswrapper[2567]: I0417 
07:56:10.006148 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-trusted-ca-bundle\") pod \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\" (UID: \"1ff6c792-cd18-42a1-ba8b-7f472632c7df\") " Apr 17 07:56:10.006569 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.006537 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1ff6c792-cd18-42a1-ba8b-7f472632c7df" (UID: "1ff6c792-cd18-42a1-ba8b-7f472632c7df"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:10.006709 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.006565 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-service-ca" (OuterVolumeSpecName: "service-ca") pod "1ff6c792-cd18-42a1-ba8b-7f472632c7df" (UID: "1ff6c792-cd18-42a1-ba8b-7f472632c7df"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:10.006709 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.006696 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-config" (OuterVolumeSpecName: "console-config") pod "1ff6c792-cd18-42a1-ba8b-7f472632c7df" (UID: "1ff6c792-cd18-42a1-ba8b-7f472632c7df"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:10.006821 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.006771 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1ff6c792-cd18-42a1-ba8b-7f472632c7df" (UID: "1ff6c792-cd18-42a1-ba8b-7f472632c7df"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:10.008504 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.008465 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1ff6c792-cd18-42a1-ba8b-7f472632c7df" (UID: "1ff6c792-cd18-42a1-ba8b-7f472632c7df"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:10.008588 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.008513 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff6c792-cd18-42a1-ba8b-7f472632c7df-kube-api-access-2cm2r" (OuterVolumeSpecName: "kube-api-access-2cm2r") pod "1ff6c792-cd18-42a1-ba8b-7f472632c7df" (UID: "1ff6c792-cd18-42a1-ba8b-7f472632c7df"). InnerVolumeSpecName "kube-api-access-2cm2r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:10.008588 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.008525 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1ff6c792-cd18-42a1-ba8b-7f472632c7df" (UID: "1ff6c792-cd18-42a1-ba8b-7f472632c7df"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:10.107457 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.107420 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-oauth-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:10.107457 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.107452 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-oauth-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:10.107457 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.107467 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:10.107706 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.107478 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-service-ca\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:10.107706 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.107487 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2cm2r\" (UniqueName: \"kubernetes.io/projected/1ff6c792-cd18-42a1-ba8b-7f472632c7df-kube-api-access-2cm2r\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:10.107706 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.107496 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-console-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:10.107706 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:56:10.107505 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ff6c792-cd18-42a1-ba8b-7f472632c7df-trusted-ca-bundle\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:10.273596 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.273568 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-557d45fc88-bpgvk"] Apr 17 07:56:10.277401 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:10.277378 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-557d45fc88-bpgvk"] Apr 17 07:56:11.789377 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:11.789342 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff6c792-cd18-42a1-ba8b-7f472632c7df" path="/var/lib/kubelet/pods/1ff6c792-cd18-42a1-ba8b-7f472632c7df/volumes" Apr 17 07:56:12.965228 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:12.965195 2567 generic.go:358] "Generic (PLEG): container finished" podID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerID="d27ca0576b2637e5b4d68290de7af1e77439e50a1f68d804a0ac6d63da7e0215" exitCode=0 Apr 17 07:56:12.965634 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:12.965260 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" event={"ID":"6b247e82-9351-4d26-8cdd-827c57479cbf","Type":"ContainerDied","Data":"d27ca0576b2637e5b4d68290de7af1e77439e50a1f68d804a0ac6d63da7e0215"} Apr 17 07:56:21.696461 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:21.696386 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 07:56:21.696990 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:21.696969 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 07:56:21.703215 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:21.703180 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 07:56:21.704076 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:21.704058 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 07:56:21.706545 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:21.706525 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 07:56:21.998189 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:21.997991 2567 generic.go:358] "Generic (PLEG): container finished" podID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerID="b857d1ac297e1217ed1446c766466d1db22539f29a9db535f094412772289f1a" exitCode=0 Apr 17 07:56:21.998189 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:21.998054 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" event={"ID":"6b247e82-9351-4d26-8cdd-827c57479cbf","Type":"ContainerDied","Data":"b857d1ac297e1217ed1446c766466d1db22539f29a9db535f094412772289f1a"} Apr 17 07:56:23.126999 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.126976 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:23.214607 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.214577 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdznd\" (UniqueName: \"kubernetes.io/projected/6b247e82-9351-4d26-8cdd-827c57479cbf-kube-api-access-xdznd\") pod \"6b247e82-9351-4d26-8cdd-827c57479cbf\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " Apr 17 07:56:23.214776 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.214676 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-util\") pod \"6b247e82-9351-4d26-8cdd-827c57479cbf\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " Apr 17 07:56:23.214776 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.214720 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-bundle\") pod \"6b247e82-9351-4d26-8cdd-827c57479cbf\" (UID: \"6b247e82-9351-4d26-8cdd-827c57479cbf\") " Apr 17 07:56:23.215438 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.215391 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-bundle" (OuterVolumeSpecName: "bundle") pod "6b247e82-9351-4d26-8cdd-827c57479cbf" (UID: "6b247e82-9351-4d26-8cdd-827c57479cbf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:23.216958 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.216932 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b247e82-9351-4d26-8cdd-827c57479cbf-kube-api-access-xdznd" (OuterVolumeSpecName: "kube-api-access-xdznd") pod "6b247e82-9351-4d26-8cdd-827c57479cbf" (UID: "6b247e82-9351-4d26-8cdd-827c57479cbf"). InnerVolumeSpecName "kube-api-access-xdznd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:23.219345 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.219325 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-util" (OuterVolumeSpecName: "util") pod "6b247e82-9351-4d26-8cdd-827c57479cbf" (UID: "6b247e82-9351-4d26-8cdd-827c57479cbf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:23.315612 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.315521 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-util\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:23.315612 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.315563 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b247e82-9351-4d26-8cdd-827c57479cbf-bundle\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:23.315612 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:23.315574 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xdznd\" (UniqueName: \"kubernetes.io/projected/6b247e82-9351-4d26-8cdd-827c57479cbf-kube-api-access-xdznd\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:56:24.006736 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:24.006695 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" event={"ID":"6b247e82-9351-4d26-8cdd-827c57479cbf","Type":"ContainerDied","Data":"6b48beab5826b3e436cb0cc141801ea933ec6eee6ea68cbfd1e04e7aec416d4b"} Apr 17 07:56:24.006736 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:24.006737 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b48beab5826b3e436cb0cc141801ea933ec6eee6ea68cbfd1e04e7aec416d4b" Apr 17 07:56:24.006929 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:24.006720 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6ltkm" Apr 17 07:56:28.855427 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.855392 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh"] Apr 17 07:56:28.855931 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.855848 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerName="extract" Apr 17 07:56:28.855931 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.855868 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerName="extract" Apr 17 07:56:28.855931 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.855888 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ff6c792-cd18-42a1-ba8b-7f472632c7df" containerName="console" Apr 17 07:56:28.855931 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.855897 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff6c792-cd18-42a1-ba8b-7f472632c7df" containerName="console" Apr 17 07:56:28.855931 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.855917 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerName="pull" Apr 17 07:56:28.855931 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.855926 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerName="pull" Apr 17 07:56:28.856188 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.855943 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerName="util" Apr 17 07:56:28.856188 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.855950 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerName="util" Apr 17 07:56:28.856188 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.856036 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b247e82-9351-4d26-8cdd-827c57479cbf" containerName="extract" Apr 17 07:56:28.856188 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.856049 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ff6c792-cd18-42a1-ba8b-7f472632c7df" containerName="console" Apr 17 07:56:28.858994 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.858974 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:28.862543 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.862520 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 07:56:28.862660 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.862645 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 07:56:28.862731 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.862548 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-2nhpg\"" Apr 17 07:56:28.862791 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.862606 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 07:56:28.869766 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.869745 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh"] Apr 17 07:56:28.964267 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.964236 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/63e373ea-33f9-441c-a5b4-494e219507d5-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh\" (UID: \"63e373ea-33f9-441c-a5b4-494e219507d5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:28.964437 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:28.964318 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q4nt\" (UniqueName: \"kubernetes.io/projected/63e373ea-33f9-441c-a5b4-494e219507d5-kube-api-access-7q4nt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh\" (UID: 
\"63e373ea-33f9-441c-a5b4-494e219507d5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:29.065388 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:29.065353 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/63e373ea-33f9-441c-a5b4-494e219507d5-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh\" (UID: \"63e373ea-33f9-441c-a5b4-494e219507d5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:29.065388 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:29.065400 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q4nt\" (UniqueName: \"kubernetes.io/projected/63e373ea-33f9-441c-a5b4-494e219507d5-kube-api-access-7q4nt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh\" (UID: \"63e373ea-33f9-441c-a5b4-494e219507d5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:29.067839 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:29.067809 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/63e373ea-33f9-441c-a5b4-494e219507d5-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh\" (UID: \"63e373ea-33f9-441c-a5b4-494e219507d5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:29.073335 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:29.073309 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q4nt\" (UniqueName: \"kubernetes.io/projected/63e373ea-33f9-441c-a5b4-494e219507d5-kube-api-access-7q4nt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh\" (UID: \"63e373ea-33f9-441c-a5b4-494e219507d5\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:29.172766 ip-10-0-130-28 kubenswrapper[2567]: 
I0417 07:56:29.172677 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:29.297808 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:29.297785 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh"] Apr 17 07:56:29.300427 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:56:29.300400 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e373ea_33f9_441c_a5b4_494e219507d5.slice/crio-7c4e04d24458f548528ddd08a9e0bd2ecfa2785d0c865c018e27574d6cd4fc9e WatchSource:0}: Error finding container 7c4e04d24458f548528ddd08a9e0bd2ecfa2785d0c865c018e27574d6cd4fc9e: Status 404 returned error can't find the container with id 7c4e04d24458f548528ddd08a9e0bd2ecfa2785d0c865c018e27574d6cd4fc9e Apr 17 07:56:29.302087 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:29.302069 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 07:56:30.025996 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:30.025954 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" event={"ID":"63e373ea-33f9-441c-a5b4-494e219507d5","Type":"ContainerStarted","Data":"7c4e04d24458f548528ddd08a9e0bd2ecfa2785d0c865c018e27574d6cd4fc9e"} Apr 17 07:56:33.039499 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.039416 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" event={"ID":"63e373ea-33f9-441c-a5b4-494e219507d5","Type":"ContainerStarted","Data":"f926f9c3868fb85e889b7abc5a2ff8e2ce717a701bd676f681ae19e36bcc9a84"} Apr 17 07:56:33.039499 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.039473 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:33.059180 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.059137 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" podStartSLOduration=1.577341927 podStartE2EDuration="5.059124177s" podCreationTimestamp="2026-04-17 07:56:28 +0000 UTC" firstStartedPulling="2026-04-17 07:56:29.302224987 +0000 UTC m=+308.105175706" lastFinishedPulling="2026-04-17 07:56:32.784007233 +0000 UTC m=+311.586957956" observedRunningTime="2026-04-17 07:56:33.055664494 +0000 UTC m=+311.858615216" watchObservedRunningTime="2026-04-17 07:56:33.059124177 +0000 UTC m=+311.862074918" Apr 17 07:56:33.325328 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.325234 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5f9tf"] Apr 17 07:56:33.328736 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.328717 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:33.331327 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.331122 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 07:56:33.331327 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.331235 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-zdt4m\"" Apr 17 07:56:33.331541 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.331524 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 07:56:33.338710 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.338500 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5f9tf"] Apr 17 07:56:33.405870 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.405838 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2brhb\" (UniqueName: \"kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-kube-api-access-2brhb\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:33.406070 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.405888 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/87df04cc-4f8c-430e-8c31-20f75305cbd0-cabundle0\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:33.406070 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.405923 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:33.506616 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.506576 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2brhb\" (UniqueName: \"kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-kube-api-access-2brhb\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:33.506818 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.506631 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/87df04cc-4f8c-430e-8c31-20f75305cbd0-cabundle0\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:33.506818 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.506676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:33.506818 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:33.506783 2567 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 17 07:56:33.506818 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:33.506797 2567 secret.go:281] references non-existent secret key: ca.crt Apr 17 07:56:33.506818 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:33.506804 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret 
key: ca.crt Apr 17 07:56:33.506818 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:33.506820 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5f9tf: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 07:56:33.507117 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:33.506883 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates podName:87df04cc-4f8c-430e-8c31-20f75305cbd0 nodeName:}" failed. No retries permitted until 2026-04-17 07:56:34.006865109 +0000 UTC m=+312.809815845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates") pod "keda-operator-ffbb595cb-5f9tf" (UID: "87df04cc-4f8c-430e-8c31-20f75305cbd0") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 07:56:33.507238 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.507213 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/87df04cc-4f8c-430e-8c31-20f75305cbd0-cabundle0\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:33.515132 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.515108 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2brhb\" (UniqueName: \"kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-kube-api-access-2brhb\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:33.722501 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.722419 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z"] Apr 17 07:56:33.726014 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.725998 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:33.729605 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.729578 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 07:56:33.739812 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.739789 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z"] Apr 17 07:56:33.809787 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.809755 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:33.809958 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.809837 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f437f57d-649c-4fff-b5e5-216ec1191f1d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:33.809958 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.809880 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rv8v\" (UniqueName: \"kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-kube-api-access-5rv8v\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:33.910838 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.910804 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f437f57d-649c-4fff-b5e5-216ec1191f1d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:33.911021 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.910936 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rv8v\" (UniqueName: \"kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-kube-api-access-5rv8v\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:33.911090 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.911076 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:33.911218 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:33.911201 2567 secret.go:281] references non-existent secret key: tls.crt Apr 17 07:56:33.911325 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:33.911221 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 07:56:33.911325 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:33.911240 2567 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 17 07:56:33.911325 ip-10-0-130-28 kubenswrapper[2567]: E0417 
07:56:33.911264 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 07:56:33.911551 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:33.911352 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates podName:f437f57d-649c-4fff-b5e5-216ec1191f1d nodeName:}" failed. No retries permitted until 2026-04-17 07:56:34.411332663 +0000 UTC m=+313.214283394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates") pod "keda-metrics-apiserver-7c9f485588-6zm8z" (UID: "f437f57d-649c-4fff-b5e5-216ec1191f1d") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 07:56:33.911638 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.911616 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f437f57d-649c-4fff-b5e5-216ec1191f1d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:33.921985 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.921958 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-9nknm"] Apr 17 07:56:33.925240 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.925218 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:56:33.925573 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.925552 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rv8v\" (UniqueName: \"kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-kube-api-access-5rv8v\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:33.927506 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.927487 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 07:56:33.934404 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:33.933986 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-9nknm"] Apr 17 07:56:34.011592 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.011564 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/832046ef-50dd-4fcb-88f4-bd27b2612cb2-certificates\") pod \"keda-admission-cf49989db-9nknm\" (UID: \"832046ef-50dd-4fcb-88f4-bd27b2612cb2\") " pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:56:34.011745 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.011611 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:34.011745 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.011638 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xghbc\" (UniqueName: 
\"kubernetes.io/projected/832046ef-50dd-4fcb-88f4-bd27b2612cb2-kube-api-access-xghbc\") pod \"keda-admission-cf49989db-9nknm\" (UID: \"832046ef-50dd-4fcb-88f4-bd27b2612cb2\") " pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:56:34.011818 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.011765 2567 secret.go:281] references non-existent secret key: ca.crt Apr 17 07:56:34.011818 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.011789 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 07:56:34.011818 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.011800 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5f9tf: references non-existent secret key: ca.crt Apr 17 07:56:34.011907 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.011861 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates podName:87df04cc-4f8c-430e-8c31-20f75305cbd0 nodeName:}" failed. No retries permitted until 2026-04-17 07:56:35.011842971 +0000 UTC m=+313.814793695 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates") pod "keda-operator-ffbb595cb-5f9tf" (UID: "87df04cc-4f8c-430e-8c31-20f75305cbd0") : references non-existent secret key: ca.crt Apr 17 07:56:34.113086 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.113047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/832046ef-50dd-4fcb-88f4-bd27b2612cb2-certificates\") pod \"keda-admission-cf49989db-9nknm\" (UID: \"832046ef-50dd-4fcb-88f4-bd27b2612cb2\") " pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:56:34.113574 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.113175 2567 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 17 07:56:34.113574 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.113194 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-9nknm: secret "keda-admission-webhooks-certs" not found Apr 17 07:56:34.113574 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.113248 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/832046ef-50dd-4fcb-88f4-bd27b2612cb2-certificates podName:832046ef-50dd-4fcb-88f4-bd27b2612cb2 nodeName:}" failed. No retries permitted until 2026-04-17 07:56:34.613226287 +0000 UTC m=+313.416177007 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/832046ef-50dd-4fcb-88f4-bd27b2612cb2-certificates") pod "keda-admission-cf49989db-9nknm" (UID: "832046ef-50dd-4fcb-88f4-bd27b2612cb2") : secret "keda-admission-webhooks-certs" not found Apr 17 07:56:34.113574 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.113319 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xghbc\" (UniqueName: \"kubernetes.io/projected/832046ef-50dd-4fcb-88f4-bd27b2612cb2-kube-api-access-xghbc\") pod \"keda-admission-cf49989db-9nknm\" (UID: \"832046ef-50dd-4fcb-88f4-bd27b2612cb2\") " pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:56:34.122960 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.122930 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xghbc\" (UniqueName: \"kubernetes.io/projected/832046ef-50dd-4fcb-88f4-bd27b2612cb2-kube-api-access-xghbc\") pod \"keda-admission-cf49989db-9nknm\" (UID: \"832046ef-50dd-4fcb-88f4-bd27b2612cb2\") " pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:56:34.417037 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.416950 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:34.417171 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.417094 2567 secret.go:281] references non-existent secret key: tls.crt Apr 17 07:56:34.417171 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.417114 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 07:56:34.417171 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.417134 
2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z: references non-existent secret key: tls.crt Apr 17 07:56:34.417264 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:34.417187 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates podName:f437f57d-649c-4fff-b5e5-216ec1191f1d nodeName:}" failed. No retries permitted until 2026-04-17 07:56:35.417173737 +0000 UTC m=+314.220124456 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates") pod "keda-metrics-apiserver-7c9f485588-6zm8z" (UID: "f437f57d-649c-4fff-b5e5-216ec1191f1d") : references non-existent secret key: tls.crt Apr 17 07:56:34.618642 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.618602 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/832046ef-50dd-4fcb-88f4-bd27b2612cb2-certificates\") pod \"keda-admission-cf49989db-9nknm\" (UID: \"832046ef-50dd-4fcb-88f4-bd27b2612cb2\") " pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:56:34.621207 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.621189 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/832046ef-50dd-4fcb-88f4-bd27b2612cb2-certificates\") pod \"keda-admission-cf49989db-9nknm\" (UID: \"832046ef-50dd-4fcb-88f4-bd27b2612cb2\") " pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:56:34.848826 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.848790 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:56:34.984347 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:34.984317 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-9nknm"] Apr 17 07:56:34.987147 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:56:34.987108 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod832046ef_50dd_4fcb_88f4_bd27b2612cb2.slice/crio-e408af7d30825e341c0c0d54c3fd9942a1ac3094b09d855a7eafdf7488e20bb2 WatchSource:0}: Error finding container e408af7d30825e341c0c0d54c3fd9942a1ac3094b09d855a7eafdf7488e20bb2: Status 404 returned error can't find the container with id e408af7d30825e341c0c0d54c3fd9942a1ac3094b09d855a7eafdf7488e20bb2 Apr 17 07:56:35.021798 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:35.021772 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:35.021939 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:35.021922 2567 secret.go:281] references non-existent secret key: ca.crt Apr 17 07:56:35.021993 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:35.021943 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 07:56:35.021993 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:35.021951 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5f9tf: references non-existent secret key: ca.crt Apr 17 07:56:35.022050 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:35.022004 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates podName:87df04cc-4f8c-430e-8c31-20f75305cbd0 nodeName:}" failed. No retries permitted until 2026-04-17 07:56:37.021988769 +0000 UTC m=+315.824939487 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates") pod "keda-operator-ffbb595cb-5f9tf" (UID: "87df04cc-4f8c-430e-8c31-20f75305cbd0") : references non-existent secret key: ca.crt Apr 17 07:56:35.047390 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:35.047357 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-9nknm" event={"ID":"832046ef-50dd-4fcb-88f4-bd27b2612cb2","Type":"ContainerStarted","Data":"e408af7d30825e341c0c0d54c3fd9942a1ac3094b09d855a7eafdf7488e20bb2"} Apr 17 07:56:35.425914 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:35.425877 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:35.426267 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:35.426032 2567 secret.go:281] references non-existent secret key: tls.crt Apr 17 07:56:35.426267 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:35.426048 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 07:56:35.426267 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:56:35.426065 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z: references non-existent secret key: tls.crt Apr 17 07:56:35.426267 ip-10-0-130-28 kubenswrapper[2567]: E0417 
07:56:35.426122 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates podName:f437f57d-649c-4fff-b5e5-216ec1191f1d nodeName:}" failed. No retries permitted until 2026-04-17 07:56:37.426104807 +0000 UTC m=+316.229055529 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates") pod "keda-metrics-apiserver-7c9f485588-6zm8z" (UID: "f437f57d-649c-4fff-b5e5-216ec1191f1d") : references non-existent secret key: tls.crt Apr 17 07:56:37.044417 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.044378 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:37.046924 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.046895 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87df04cc-4f8c-430e-8c31-20f75305cbd0-certificates\") pod \"keda-operator-ffbb595cb-5f9tf\" (UID: \"87df04cc-4f8c-430e-8c31-20f75305cbd0\") " pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:37.055952 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.055915 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-9nknm" event={"ID":"832046ef-50dd-4fcb-88f4-bd27b2612cb2","Type":"ContainerStarted","Data":"42b3b5d1709144f8ec8bdd04738cb8a8206dc7a862ab268360d4fec50f8db586"} Apr 17 07:56:37.056069 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.056052 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 
07:56:37.073579 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.073532 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-9nknm" podStartSLOduration=2.410616172 podStartE2EDuration="4.073517843s" podCreationTimestamp="2026-04-17 07:56:33 +0000 UTC" firstStartedPulling="2026-04-17 07:56:34.988485642 +0000 UTC m=+313.791436365" lastFinishedPulling="2026-04-17 07:56:36.651387304 +0000 UTC m=+315.454338036" observedRunningTime="2026-04-17 07:56:37.07097765 +0000 UTC m=+315.873928388" watchObservedRunningTime="2026-04-17 07:56:37.073517843 +0000 UTC m=+315.876468581" Apr 17 07:56:37.242174 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.242136 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:37.367229 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.367201 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5f9tf"] Apr 17 07:56:37.369224 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:56:37.369191 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87df04cc_4f8c_430e_8c31_20f75305cbd0.slice/crio-72ae595b7a2d889561a06faee9ed86d6f52cadc290b43e9b25a700046eec1ace WatchSource:0}: Error finding container 72ae595b7a2d889561a06faee9ed86d6f52cadc290b43e9b25a700046eec1ace: Status 404 returned error can't find the container with id 72ae595b7a2d889561a06faee9ed86d6f52cadc290b43e9b25a700046eec1ace Apr 17 07:56:37.447902 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.447868 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:37.450477 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.450448 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f437f57d-649c-4fff-b5e5-216ec1191f1d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6zm8z\" (UID: \"f437f57d-649c-4fff-b5e5-216ec1191f1d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:37.636525 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.636436 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:37.757332 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:37.757303 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z"] Apr 17 07:56:37.759333 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:56:37.759306 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf437f57d_649c_4fff_b5e5_216ec1191f1d.slice/crio-91b665e53c9aabaff98632b6784721b28cb3dc80dfc10e4a86f8feae7f69b7e2 WatchSource:0}: Error finding container 91b665e53c9aabaff98632b6784721b28cb3dc80dfc10e4a86f8feae7f69b7e2: Status 404 returned error can't find the container with id 91b665e53c9aabaff98632b6784721b28cb3dc80dfc10e4a86f8feae7f69b7e2 Apr 17 07:56:38.060304 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:38.060255 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" event={"ID":"87df04cc-4f8c-430e-8c31-20f75305cbd0","Type":"ContainerStarted","Data":"72ae595b7a2d889561a06faee9ed86d6f52cadc290b43e9b25a700046eec1ace"} Apr 17 07:56:38.061191 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:38.061168 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" 
event={"ID":"f437f57d-649c-4fff-b5e5-216ec1191f1d","Type":"ContainerStarted","Data":"91b665e53c9aabaff98632b6784721b28cb3dc80dfc10e4a86f8feae7f69b7e2"} Apr 17 07:56:42.089089 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:42.089048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" event={"ID":"87df04cc-4f8c-430e-8c31-20f75305cbd0","Type":"ContainerStarted","Data":"c8045e3ae3075f063603b43f606d3d1e018d6fe3102088329c1efeea2f233f90"} Apr 17 07:56:42.089573 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:42.089305 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:56:42.090771 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:42.090748 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" event={"ID":"f437f57d-649c-4fff-b5e5-216ec1191f1d","Type":"ContainerStarted","Data":"85d624d633b8a9c96a29d4d67ab6c0f9c54c1d7e88398683266a64601cc41086"} Apr 17 07:56:42.090903 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:42.090886 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:42.104982 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:42.104922 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" podStartSLOduration=4.474643309 podStartE2EDuration="9.104907438s" podCreationTimestamp="2026-04-17 07:56:33 +0000 UTC" firstStartedPulling="2026-04-17 07:56:37.370642259 +0000 UTC m=+316.173592977" lastFinishedPulling="2026-04-17 07:56:42.000906383 +0000 UTC m=+320.803857106" observedRunningTime="2026-04-17 07:56:42.103675308 +0000 UTC m=+320.906626053" watchObservedRunningTime="2026-04-17 07:56:42.104907438 +0000 UTC m=+320.907858179" Apr 17 07:56:42.121540 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:42.121484 
2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" podStartSLOduration=4.943190479 podStartE2EDuration="9.12146982s" podCreationTimestamp="2026-04-17 07:56:33 +0000 UTC" firstStartedPulling="2026-04-17 07:56:37.760627044 +0000 UTC m=+316.563577764" lastFinishedPulling="2026-04-17 07:56:41.938906387 +0000 UTC m=+320.741857105" observedRunningTime="2026-04-17 07:56:42.119889465 +0000 UTC m=+320.922840216" watchObservedRunningTime="2026-04-17 07:56:42.12146982 +0000 UTC m=+320.924420560" Apr 17 07:56:53.099047 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:53.099017 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6zm8z" Apr 17 07:56:54.045589 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:54.045557 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xz7nh" Apr 17 07:56:58.064337 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:56:58.064273 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-9nknm" Apr 17 07:57:03.097314 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:03.097214 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-5f9tf" Apr 17 07:57:39.822812 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:39.822775 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-5f85w"] Apr 17 07:57:39.826770 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:39.826751 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:57:39.829682 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:39.829649 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ml8vh\"" Apr 17 07:57:39.829682 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:39.829649 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 07:57:39.829961 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:39.829650 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 07:57:39.829961 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:39.829650 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 07:57:39.836821 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:39.836777 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-5f85w"] Apr 17 07:57:39.909495 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:39.909457 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdrr\" (UniqueName: \"kubernetes.io/projected/647fc402-09c1-4868-90f2-a04650ef09d1-kube-api-access-dhdrr\") pod \"seaweedfs-86cc847c5c-5f85w\" (UID: \"647fc402-09c1-4868-90f2-a04650ef09d1\") " pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:57:39.909702 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:39.909542 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/647fc402-09c1-4868-90f2-a04650ef09d1-data\") pod \"seaweedfs-86cc847c5c-5f85w\" (UID: \"647fc402-09c1-4868-90f2-a04650ef09d1\") " pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:57:40.010923 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:40.010886 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dhdrr\" (UniqueName: \"kubernetes.io/projected/647fc402-09c1-4868-90f2-a04650ef09d1-kube-api-access-dhdrr\") pod \"seaweedfs-86cc847c5c-5f85w\" (UID: \"647fc402-09c1-4868-90f2-a04650ef09d1\") " pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:57:40.011115 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:40.010965 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/647fc402-09c1-4868-90f2-a04650ef09d1-data\") pod \"seaweedfs-86cc847c5c-5f85w\" (UID: \"647fc402-09c1-4868-90f2-a04650ef09d1\") " pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:57:40.011374 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:40.011354 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/647fc402-09c1-4868-90f2-a04650ef09d1-data\") pod \"seaweedfs-86cc847c5c-5f85w\" (UID: \"647fc402-09c1-4868-90f2-a04650ef09d1\") " pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:57:40.019932 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:40.019905 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdrr\" (UniqueName: \"kubernetes.io/projected/647fc402-09c1-4868-90f2-a04650ef09d1-kube-api-access-dhdrr\") pod \"seaweedfs-86cc847c5c-5f85w\" (UID: \"647fc402-09c1-4868-90f2-a04650ef09d1\") " pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:57:40.138940 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:40.138853 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:57:40.269732 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:40.269681 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-5f85w"] Apr 17 07:57:40.272101 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:57:40.272064 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647fc402_09c1_4868_90f2_a04650ef09d1.slice/crio-23175de2bb1008378ddb4b77d4208ffc69da8c6056e8e2d4026ae63d0fdac9b0 WatchSource:0}: Error finding container 23175de2bb1008378ddb4b77d4208ffc69da8c6056e8e2d4026ae63d0fdac9b0: Status 404 returned error can't find the container with id 23175de2bb1008378ddb4b77d4208ffc69da8c6056e8e2d4026ae63d0fdac9b0 Apr 17 07:57:40.302889 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:40.302851 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-5f85w" event={"ID":"647fc402-09c1-4868-90f2-a04650ef09d1","Type":"ContainerStarted","Data":"23175de2bb1008378ddb4b77d4208ffc69da8c6056e8e2d4026ae63d0fdac9b0"} Apr 17 07:57:43.316199 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:43.316105 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-5f85w" event={"ID":"647fc402-09c1-4868-90f2-a04650ef09d1","Type":"ContainerStarted","Data":"4f20f1291aab3c3f44c363e1275a93c0321afdc8e30246c9ebfad0dde1fb1340"} Apr 17 07:57:43.316575 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:43.316248 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:57:43.333501 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:43.333453 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-5f85w" podStartSLOduration=1.578812229 podStartE2EDuration="4.333439684s" podCreationTimestamp="2026-04-17 07:57:39 +0000 UTC" firstStartedPulling="2026-04-17 
07:57:40.273469904 +0000 UTC m=+379.076420624" lastFinishedPulling="2026-04-17 07:57:43.028097343 +0000 UTC m=+381.831048079" observedRunningTime="2026-04-17 07:57:43.332098594 +0000 UTC m=+382.135049334" watchObservedRunningTime="2026-04-17 07:57:43.333439684 +0000 UTC m=+382.136390425" Apr 17 07:57:49.321929 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:57:49.321899 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-5f85w" Apr 17 07:58:50.277867 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.277836 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-xp9q2"] Apr 17 07:58:50.281533 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.281511 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:58:50.284264 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.284241 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-7s2vs\"" Apr 17 07:58:50.284555 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.284531 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 07:58:50.291948 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.291907 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-xp9q2"] Apr 17 07:58:50.295566 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.295540 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-wlvqm"] Apr 17 07:58:50.299333 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.299313 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:58:50.301718 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.301699 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 07:58:50.301848 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.301785 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-5nw6w\"" Apr 17 07:58:50.306337 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.306315 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wlvqm"] Apr 17 07:58:50.323633 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.323597 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4552494-143e-4167-9599-7018935accc9-tls-certs\") pod \"model-serving-api-86f7b4b499-xp9q2\" (UID: \"c4552494-143e-4167-9599-7018935accc9\") " pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:58:50.323773 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.323669 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c6m7\" (UniqueName: \"kubernetes.io/projected/c4552494-143e-4167-9599-7018935accc9-kube-api-access-6c6m7\") pod \"model-serving-api-86f7b4b499-xp9q2\" (UID: \"c4552494-143e-4167-9599-7018935accc9\") " pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:58:50.323773 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.323692 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7g9t\" (UniqueName: \"kubernetes.io/projected/972f2ad3-6129-4d85-9ba7-6d661a492dad-kube-api-access-c7g9t\") pod \"odh-model-controller-696fc77849-wlvqm\" (UID: \"972f2ad3-6129-4d85-9ba7-6d661a492dad\") " 
pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:58:50.323773 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.323709 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/972f2ad3-6129-4d85-9ba7-6d661a492dad-cert\") pod \"odh-model-controller-696fc77849-wlvqm\" (UID: \"972f2ad3-6129-4d85-9ba7-6d661a492dad\") " pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:58:50.424177 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.424141 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c6m7\" (UniqueName: \"kubernetes.io/projected/c4552494-143e-4167-9599-7018935accc9-kube-api-access-6c6m7\") pod \"model-serving-api-86f7b4b499-xp9q2\" (UID: \"c4552494-143e-4167-9599-7018935accc9\") " pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:58:50.424177 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.424182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7g9t\" (UniqueName: \"kubernetes.io/projected/972f2ad3-6129-4d85-9ba7-6d661a492dad-kube-api-access-c7g9t\") pod \"odh-model-controller-696fc77849-wlvqm\" (UID: \"972f2ad3-6129-4d85-9ba7-6d661a492dad\") " pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:58:50.424465 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.424203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/972f2ad3-6129-4d85-9ba7-6d661a492dad-cert\") pod \"odh-model-controller-696fc77849-wlvqm\" (UID: \"972f2ad3-6129-4d85-9ba7-6d661a492dad\") " pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:58:50.424465 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.424270 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4552494-143e-4167-9599-7018935accc9-tls-certs\") pod 
\"model-serving-api-86f7b4b499-xp9q2\" (UID: \"c4552494-143e-4167-9599-7018935accc9\") " pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:58:50.424465 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:58:50.424403 2567 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 17 07:58:50.424465 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:58:50.424467 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4552494-143e-4167-9599-7018935accc9-tls-certs podName:c4552494-143e-4167-9599-7018935accc9 nodeName:}" failed. No retries permitted until 2026-04-17 07:58:50.924448406 +0000 UTC m=+449.727399136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c4552494-143e-4167-9599-7018935accc9-tls-certs") pod "model-serving-api-86f7b4b499-xp9q2" (UID: "c4552494-143e-4167-9599-7018935accc9") : secret "model-serving-api-tls" not found Apr 17 07:58:50.426867 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.426840 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/972f2ad3-6129-4d85-9ba7-6d661a492dad-cert\") pod \"odh-model-controller-696fc77849-wlvqm\" (UID: \"972f2ad3-6129-4d85-9ba7-6d661a492dad\") " pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:58:50.433522 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.433499 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7g9t\" (UniqueName: \"kubernetes.io/projected/972f2ad3-6129-4d85-9ba7-6d661a492dad-kube-api-access-c7g9t\") pod \"odh-model-controller-696fc77849-wlvqm\" (UID: \"972f2ad3-6129-4d85-9ba7-6d661a492dad\") " pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:58:50.434764 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.434729 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c6m7\" 
(UniqueName: \"kubernetes.io/projected/c4552494-143e-4167-9599-7018935accc9-kube-api-access-6c6m7\") pod \"model-serving-api-86f7b4b499-xp9q2\" (UID: \"c4552494-143e-4167-9599-7018935accc9\") " pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:58:50.613679 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.613592 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:58:50.742763 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.742739 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wlvqm"] Apr 17 07:58:50.745365 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:58:50.745335 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod972f2ad3_6129_4d85_9ba7_6d661a492dad.slice/crio-cb667999b43d4bfe2ce2258775af3aee5ca82b977ad92383da311c634a65f7ee WatchSource:0}: Error finding container cb667999b43d4bfe2ce2258775af3aee5ca82b977ad92383da311c634a65f7ee: Status 404 returned error can't find the container with id cb667999b43d4bfe2ce2258775af3aee5ca82b977ad92383da311c634a65f7ee Apr 17 07:58:50.927583 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.927496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4552494-143e-4167-9599-7018935accc9-tls-certs\") pod \"model-serving-api-86f7b4b499-xp9q2\" (UID: \"c4552494-143e-4167-9599-7018935accc9\") " pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:58:50.930056 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:50.930037 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4552494-143e-4167-9599-7018935accc9-tls-certs\") pod \"model-serving-api-86f7b4b499-xp9q2\" (UID: \"c4552494-143e-4167-9599-7018935accc9\") " pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 
07:58:51.194553 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:51.194463 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:58:51.348414 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:51.348160 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-xp9q2"] Apr 17 07:58:51.350651 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:58:51.350615 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4552494_143e_4167_9599_7018935accc9.slice/crio-353fa29dab68c633ad1c19452e458f136f01c402b6bae54e78d51916020a768e WatchSource:0}: Error finding container 353fa29dab68c633ad1c19452e458f136f01c402b6bae54e78d51916020a768e: Status 404 returned error can't find the container with id 353fa29dab68c633ad1c19452e458f136f01c402b6bae54e78d51916020a768e Apr 17 07:58:51.552218 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:51.552168 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-xp9q2" event={"ID":"c4552494-143e-4167-9599-7018935accc9","Type":"ContainerStarted","Data":"353fa29dab68c633ad1c19452e458f136f01c402b6bae54e78d51916020a768e"} Apr 17 07:58:51.553640 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:51.553608 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wlvqm" event={"ID":"972f2ad3-6129-4d85-9ba7-6d661a492dad","Type":"ContainerStarted","Data":"cb667999b43d4bfe2ce2258775af3aee5ca82b977ad92383da311c634a65f7ee"} Apr 17 07:58:55.573811 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:55.573772 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-xp9q2" event={"ID":"c4552494-143e-4167-9599-7018935accc9","Type":"ContainerStarted","Data":"0f6b08add564b60152afda98f8cddedb37039b4868805c6231b957932173f4d4"} Apr 17 07:58:55.574263 ip-10-0-130-28 
kubenswrapper[2567]: I0417 07:58:55.573890 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:58:55.575119 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:55.575097 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wlvqm" event={"ID":"972f2ad3-6129-4d85-9ba7-6d661a492dad","Type":"ContainerStarted","Data":"7297ccf2444705a2950e6612b8d5ccd27f00ff147ddd8d44aea4f184abe26d19"} Apr 17 07:58:55.575226 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:55.575147 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:58:55.591942 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:55.591898 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-xp9q2" podStartSLOduration=2.335938735 podStartE2EDuration="5.591884538s" podCreationTimestamp="2026-04-17 07:58:50 +0000 UTC" firstStartedPulling="2026-04-17 07:58:51.352668254 +0000 UTC m=+450.155618973" lastFinishedPulling="2026-04-17 07:58:54.608614041 +0000 UTC m=+453.411564776" observedRunningTime="2026-04-17 07:58:55.590872472 +0000 UTC m=+454.393823215" watchObservedRunningTime="2026-04-17 07:58:55.591884538 +0000 UTC m=+454.394835278" Apr 17 07:58:55.605860 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:58:55.605819 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-wlvqm" podStartSLOduration=1.798251705 podStartE2EDuration="5.605806665s" podCreationTimestamp="2026-04-17 07:58:50 +0000 UTC" firstStartedPulling="2026-04-17 07:58:50.746552675 +0000 UTC m=+449.549503394" lastFinishedPulling="2026-04-17 07:58:54.554107632 +0000 UTC m=+453.357058354" observedRunningTime="2026-04-17 07:58:55.605097216 +0000 UTC m=+454.408047958" watchObservedRunningTime="2026-04-17 07:58:55.605806665 +0000 UTC 
m=+454.408757479" Apr 17 07:59:06.581421 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:06.581391 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-wlvqm" Apr 17 07:59:06.583106 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:06.583087 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-xp9q2" Apr 17 07:59:18.610473 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.610437 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc"] Apr 17 07:59:18.613941 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.613920 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:18.616855 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.616833 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 17 07:59:18.620804 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.620783 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc"] Apr 17 07:59:18.778059 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.778003 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b74ea4a-94de-47cf-8896-749a1561e518-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-mpxlc\" (UID: \"1b74ea4a-94de-47cf-8896-749a1561e518\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:18.778261 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.778075 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wpqn\" (UniqueName: \"kubernetes.io/projected/1b74ea4a-94de-47cf-8896-749a1561e518-kube-api-access-4wpqn\") pod \"seaweedfs-tls-custom-ddd4dbfd-mpxlc\" (UID: 
\"1b74ea4a-94de-47cf-8896-749a1561e518\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:18.879199 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.879116 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b74ea4a-94de-47cf-8896-749a1561e518-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-mpxlc\" (UID: \"1b74ea4a-94de-47cf-8896-749a1561e518\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:18.879388 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.879253 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpqn\" (UniqueName: \"kubernetes.io/projected/1b74ea4a-94de-47cf-8896-749a1561e518-kube-api-access-4wpqn\") pod \"seaweedfs-tls-custom-ddd4dbfd-mpxlc\" (UID: \"1b74ea4a-94de-47cf-8896-749a1561e518\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:18.879581 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.879557 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b74ea4a-94de-47cf-8896-749a1561e518-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-mpxlc\" (UID: \"1b74ea4a-94de-47cf-8896-749a1561e518\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:18.887981 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.887961 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wpqn\" (UniqueName: \"kubernetes.io/projected/1b74ea4a-94de-47cf-8896-749a1561e518-kube-api-access-4wpqn\") pod \"seaweedfs-tls-custom-ddd4dbfd-mpxlc\" (UID: \"1b74ea4a-94de-47cf-8896-749a1561e518\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:18.923926 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:18.923898 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:19.052003 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:19.051975 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc"] Apr 17 07:59:19.054007 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:59:19.053978 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b74ea4a_94de_47cf_8896_749a1561e518.slice/crio-2976220f3207ae81a305d5fbf8031a04eb1c726554f1e223723689960ffad651 WatchSource:0}: Error finding container 2976220f3207ae81a305d5fbf8031a04eb1c726554f1e223723689960ffad651: Status 404 returned error can't find the container with id 2976220f3207ae81a305d5fbf8031a04eb1c726554f1e223723689960ffad651 Apr 17 07:59:19.659857 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:19.659770 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" event={"ID":"1b74ea4a-94de-47cf-8896-749a1561e518","Type":"ContainerStarted","Data":"6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab"} Apr 17 07:59:19.659857 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:19.659809 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" event={"ID":"1b74ea4a-94de-47cf-8896-749a1561e518","Type":"ContainerStarted","Data":"2976220f3207ae81a305d5fbf8031a04eb1c726554f1e223723689960ffad651"} Apr 17 07:59:19.677218 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:19.677163 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" podStartSLOduration=1.389606823 podStartE2EDuration="1.677149365s" podCreationTimestamp="2026-04-17 07:59:18 +0000 UTC" firstStartedPulling="2026-04-17 07:59:19.055137434 +0000 UTC m=+477.858088153" lastFinishedPulling="2026-04-17 07:59:19.342679974 +0000 UTC m=+478.145630695" 
observedRunningTime="2026-04-17 07:59:19.675131689 +0000 UTC m=+478.478082432" watchObservedRunningTime="2026-04-17 07:59:19.677149365 +0000 UTC m=+478.480100105" Apr 17 07:59:20.283329 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:20.283274 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc"] Apr 17 07:59:21.668123 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:21.668081 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" podUID="1b74ea4a-94de-47cf-8896-749a1561e518" containerName="seaweedfs-tls-custom" containerID="cri-o://6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab" gracePeriod=30 Apr 17 07:59:23.994575 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:23.994525 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cbb94d8cc-rksh9"] Apr 17 07:59:23.999866 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:23.999835 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.013943 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.013917 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cbb94d8cc-rksh9"] Apr 17 07:59:24.124906 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.124872 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-service-ca\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.124906 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.124919 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0b000a1-3cef-4304-9fdc-8848f27be403-console-oauth-config\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.125166 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.124997 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-trusted-ca-bundle\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.125166 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.125080 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db28k\" (UniqueName: \"kubernetes.io/projected/f0b000a1-3cef-4304-9fdc-8848f27be403-kube-api-access-db28k\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 
07:59:24.125166 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.125109 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b000a1-3cef-4304-9fdc-8848f27be403-console-serving-cert\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.125166 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.125132 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-console-config\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.125383 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.125209 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-oauth-serving-cert\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.226098 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.226062 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-service-ca\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.226277 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.226110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0b000a1-3cef-4304-9fdc-8848f27be403-console-oauth-config\") pod 
\"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.226277 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.226150 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-trusted-ca-bundle\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.226277 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.226189 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db28k\" (UniqueName: \"kubernetes.io/projected/f0b000a1-3cef-4304-9fdc-8848f27be403-kube-api-access-db28k\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.226277 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.226213 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b000a1-3cef-4304-9fdc-8848f27be403-console-serving-cert\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.226277 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.226241 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-console-config\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.226565 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.226468 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-oauth-serving-cert\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.226981 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.226901 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-service-ca\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.226981 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.226939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-console-config\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.227396 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.227375 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-trusted-ca-bundle\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.227533 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.227511 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0b000a1-3cef-4304-9fdc-8848f27be403-oauth-serving-cert\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.228802 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.228771 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/f0b000a1-3cef-4304-9fdc-8848f27be403-console-oauth-config\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.229038 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.229017 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b000a1-3cef-4304-9fdc-8848f27be403-console-serving-cert\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.235660 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.235638 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db28k\" (UniqueName: \"kubernetes.io/projected/f0b000a1-3cef-4304-9fdc-8848f27be403-kube-api-access-db28k\") pod \"console-5cbb94d8cc-rksh9\" (UID: \"f0b000a1-3cef-4304-9fdc-8848f27be403\") " pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.309675 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.309640 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:24.435175 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.435142 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cbb94d8cc-rksh9"] Apr 17 07:59:24.681488 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.681411 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cbb94d8cc-rksh9" event={"ID":"f0b000a1-3cef-4304-9fdc-8848f27be403","Type":"ContainerStarted","Data":"1930b3f10ab43ec4c408e6f2a41e1b8cfa08ac76a43152fce0459d30595f8794"} Apr 17 07:59:24.681488 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.681446 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cbb94d8cc-rksh9" event={"ID":"f0b000a1-3cef-4304-9fdc-8848f27be403","Type":"ContainerStarted","Data":"09da8c34f1caaa29c9816822261c6a7fe8e4371365223d6698da4196c8aea00b"} Apr 17 07:59:24.699980 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:24.699922 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5cbb94d8cc-rksh9" podStartSLOduration=1.699902213 podStartE2EDuration="1.699902213s" podCreationTimestamp="2026-04-17 07:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:59:24.69904834 +0000 UTC m=+483.501999080" watchObservedRunningTime="2026-04-17 07:59:24.699902213 +0000 UTC m=+483.502852956" Apr 17 07:59:34.310479 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:34.310442 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:34.310479 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:34.310485 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:34.315508 ip-10-0-130-28 kubenswrapper[2567]: 
I0417 07:59:34.315484 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:34.724008 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:34.723920 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cbb94d8cc-rksh9" Apr 17 07:59:34.785574 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:34.785541 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-db59969c6-c57gh"] Apr 17 07:59:49.894211 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:59:49.894160 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b74ea4a_94de_47cf_8896_749a1561e518.slice/crio-conmon-6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b74ea4a_94de_47cf_8896_749a1561e518.slice/crio-6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab.scope\": RecentStats: unable to find data in memory cache]" Apr 17 07:59:49.897317 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:59:49.897261 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b74ea4a_94de_47cf_8896_749a1561e518.slice/crio-6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b74ea4a_94de_47cf_8896_749a1561e518.slice/crio-conmon-6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab.scope\": RecentStats: unable to find data in memory cache]" Apr 17 07:59:49.897456 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:59:49.896722 2567 
cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b74ea4a_94de_47cf_8896_749a1561e518.slice/crio-conmon-6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b74ea4a_94de_47cf_8896_749a1561e518.slice/crio-6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab.scope\": RecentStats: unable to find data in memory cache]" Apr 17 07:59:50.006846 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.006822 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:50.043189 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.043156 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wpqn\" (UniqueName: \"kubernetes.io/projected/1b74ea4a-94de-47cf-8896-749a1561e518-kube-api-access-4wpqn\") pod \"1b74ea4a-94de-47cf-8896-749a1561e518\" (UID: \"1b74ea4a-94de-47cf-8896-749a1561e518\") " Apr 17 07:59:50.043371 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.043215 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b74ea4a-94de-47cf-8896-749a1561e518-data\") pod \"1b74ea4a-94de-47cf-8896-749a1561e518\" (UID: \"1b74ea4a-94de-47cf-8896-749a1561e518\") " Apr 17 07:59:50.044552 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.044523 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b74ea4a-94de-47cf-8896-749a1561e518-data" (OuterVolumeSpecName: "data") pod "1b74ea4a-94de-47cf-8896-749a1561e518" (UID: "1b74ea4a-94de-47cf-8896-749a1561e518"). InnerVolumeSpecName "data". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:59:50.045665 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.045639 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b74ea4a-94de-47cf-8896-749a1561e518-kube-api-access-4wpqn" (OuterVolumeSpecName: "kube-api-access-4wpqn") pod "1b74ea4a-94de-47cf-8896-749a1561e518" (UID: "1b74ea4a-94de-47cf-8896-749a1561e518"). InnerVolumeSpecName "kube-api-access-4wpqn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:59:50.144961 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.144869 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4wpqn\" (UniqueName: \"kubernetes.io/projected/1b74ea4a-94de-47cf-8896-749a1561e518-kube-api-access-4wpqn\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:59:50.144961 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.144904 2567 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b74ea4a-94de-47cf-8896-749a1561e518-data\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 07:59:50.783553 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.783515 2567 generic.go:358] "Generic (PLEG): container finished" podID="1b74ea4a-94de-47cf-8896-749a1561e518" containerID="6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab" exitCode=0 Apr 17 07:59:50.783797 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.783570 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" Apr 17 07:59:50.783797 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.783577 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" event={"ID":"1b74ea4a-94de-47cf-8896-749a1561e518","Type":"ContainerDied","Data":"6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab"} Apr 17 07:59:50.783797 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.783613 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc" event={"ID":"1b74ea4a-94de-47cf-8896-749a1561e518","Type":"ContainerDied","Data":"2976220f3207ae81a305d5fbf8031a04eb1c726554f1e223723689960ffad651"} Apr 17 07:59:50.783797 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.783633 2567 scope.go:117] "RemoveContainer" containerID="6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab" Apr 17 07:59:50.793198 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.793176 2567 scope.go:117] "RemoveContainer" containerID="6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab" Apr 17 07:59:50.793483 ip-10-0-130-28 kubenswrapper[2567]: E0417 07:59:50.793462 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab\": container with ID starting with 6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab not found: ID does not exist" containerID="6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab" Apr 17 07:59:50.793583 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.793490 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab"} err="failed to get container status \"6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab\": rpc error: code = 
NotFound desc = could not find container \"6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab\": container with ID starting with 6f20d21bed987ba6d57006df9274347e81c1efe0c650660580031e05b2a8e3ab not found: ID does not exist" Apr 17 07:59:50.811832 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.811757 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc"] Apr 17 07:59:50.814414 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.814389 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-mpxlc"] Apr 17 07:59:50.850148 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.850114 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb"] Apr 17 07:59:50.850587 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.850571 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b74ea4a-94de-47cf-8896-749a1561e518" containerName="seaweedfs-tls-custom" Apr 17 07:59:50.850637 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.850588 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b74ea4a-94de-47cf-8896-749a1561e518" containerName="seaweedfs-tls-custom" Apr 17 07:59:50.850687 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.850644 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b74ea4a-94de-47cf-8896-749a1561e518" containerName="seaweedfs-tls-custom" Apr 17 07:59:50.855404 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.855386 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:50.857993 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.857976 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 17 07:59:50.858159 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.858141 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 17 07:59:50.868124 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.868101 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb"] Apr 17 07:59:50.952699 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.952659 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6w7f\" (UniqueName: \"kubernetes.io/projected/e5a78afb-f699-4914-a9d7-5e211d44043d-kube-api-access-z6w7f\") pod \"seaweedfs-tls-custom-5c88b85bb7-mdclb\" (UID: \"e5a78afb-f699-4914-a9d7-5e211d44043d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:50.953093 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.952722 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/e5a78afb-f699-4914-a9d7-5e211d44043d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-mdclb\" (UID: \"e5a78afb-f699-4914-a9d7-5e211d44043d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:50.953093 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:50.952748 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e5a78afb-f699-4914-a9d7-5e211d44043d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-mdclb\" (UID: \"e5a78afb-f699-4914-a9d7-5e211d44043d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 
17 07:59:51.053510 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.053420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6w7f\" (UniqueName: \"kubernetes.io/projected/e5a78afb-f699-4914-a9d7-5e211d44043d-kube-api-access-z6w7f\") pod \"seaweedfs-tls-custom-5c88b85bb7-mdclb\" (UID: \"e5a78afb-f699-4914-a9d7-5e211d44043d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:51.053510 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.053485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/e5a78afb-f699-4914-a9d7-5e211d44043d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-mdclb\" (UID: \"e5a78afb-f699-4914-a9d7-5e211d44043d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:51.053739 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.053513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e5a78afb-f699-4914-a9d7-5e211d44043d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-mdclb\" (UID: \"e5a78afb-f699-4914-a9d7-5e211d44043d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:51.053960 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.053936 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e5a78afb-f699-4914-a9d7-5e211d44043d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-mdclb\" (UID: \"e5a78afb-f699-4914-a9d7-5e211d44043d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:51.056125 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.056104 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/e5a78afb-f699-4914-a9d7-5e211d44043d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-mdclb\" (UID: 
\"e5a78afb-f699-4914-a9d7-5e211d44043d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:51.062520 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.062502 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6w7f\" (UniqueName: \"kubernetes.io/projected/e5a78afb-f699-4914-a9d7-5e211d44043d-kube-api-access-z6w7f\") pod \"seaweedfs-tls-custom-5c88b85bb7-mdclb\" (UID: \"e5a78afb-f699-4914-a9d7-5e211d44043d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:51.164766 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.164727 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" Apr 17 07:59:51.294192 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.294167 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb"] Apr 17 07:59:51.297003 ip-10-0-130-28 kubenswrapper[2567]: W0417 07:59:51.296974 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a78afb_f699_4914_a9d7_5e211d44043d.slice/crio-07288754a6e2524f10351594efb9b39a620926596d16ec8cf214bb952b388049 WatchSource:0}: Error finding container 07288754a6e2524f10351594efb9b39a620926596d16ec8cf214bb952b388049: Status 404 returned error can't find the container with id 07288754a6e2524f10351594efb9b39a620926596d16ec8cf214bb952b388049 Apr 17 07:59:51.783178 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.783136 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b74ea4a-94de-47cf-8896-749a1561e518" path="/var/lib/kubelet/pods/1b74ea4a-94de-47cf-8896-749a1561e518/volumes" Apr 17 07:59:51.787898 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.787872 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" 
event={"ID":"e5a78afb-f699-4914-a9d7-5e211d44043d","Type":"ContainerStarted","Data":"1c3f674c64ce0de984c68857f39bd5e446349ef04738bed978b676473367c061"} Apr 17 07:59:51.787898 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.787901 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" event={"ID":"e5a78afb-f699-4914-a9d7-5e211d44043d","Type":"ContainerStarted","Data":"07288754a6e2524f10351594efb9b39a620926596d16ec8cf214bb952b388049"} Apr 17 07:59:51.841424 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:51.841308 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mdclb" podStartSLOduration=1.5715714570000001 podStartE2EDuration="1.841281679s" podCreationTimestamp="2026-04-17 07:59:50 +0000 UTC" firstStartedPulling="2026-04-17 07:59:51.29850695 +0000 UTC m=+510.101457672" lastFinishedPulling="2026-04-17 07:59:51.56821716 +0000 UTC m=+510.371167894" observedRunningTime="2026-04-17 07:59:51.839393971 +0000 UTC m=+510.642344712" watchObservedRunningTime="2026-04-17 07:59:51.841281679 +0000 UTC m=+510.644232419" Apr 17 07:59:59.602222 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.602134 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm"] Apr 17 07:59:59.606877 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.606859 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.608992 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.608974 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 17 07:59:59.609060 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.608976 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 17 07:59:59.612132 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.612109 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm"] Apr 17 07:59:59.730315 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.730273 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljqv6\" (UniqueName: \"kubernetes.io/projected/851cebd7-9b55-4a3f-8067-2de2b32dd855-kube-api-access-ljqv6\") pod \"seaweedfs-tls-serving-7fd5766db9-fsxpm\" (UID: \"851cebd7-9b55-4a3f-8067-2de2b32dd855\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.730481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.730405 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/851cebd7-9b55-4a3f-8067-2de2b32dd855-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fsxpm\" (UID: \"851cebd7-9b55-4a3f-8067-2de2b32dd855\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.730481 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.730439 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/851cebd7-9b55-4a3f-8067-2de2b32dd855-data\") pod \"seaweedfs-tls-serving-7fd5766db9-fsxpm\" (UID: \"851cebd7-9b55-4a3f-8067-2de2b32dd855\") " 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.806386 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.806352 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-db59969c6-c57gh" podUID="6a8364f9-20d1-4c3d-bd17-94f7f0041431" containerName="console" containerID="cri-o://d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce" gracePeriod=15 Apr 17 07:59:59.831128 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.831106 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljqv6\" (UniqueName: \"kubernetes.io/projected/851cebd7-9b55-4a3f-8067-2de2b32dd855-kube-api-access-ljqv6\") pod \"seaweedfs-tls-serving-7fd5766db9-fsxpm\" (UID: \"851cebd7-9b55-4a3f-8067-2de2b32dd855\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.831220 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.831163 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/851cebd7-9b55-4a3f-8067-2de2b32dd855-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fsxpm\" (UID: \"851cebd7-9b55-4a3f-8067-2de2b32dd855\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.831220 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.831178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/851cebd7-9b55-4a3f-8067-2de2b32dd855-data\") pod \"seaweedfs-tls-serving-7fd5766db9-fsxpm\" (UID: \"851cebd7-9b55-4a3f-8067-2de2b32dd855\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.831500 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.831484 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/851cebd7-9b55-4a3f-8067-2de2b32dd855-data\") pod 
\"seaweedfs-tls-serving-7fd5766db9-fsxpm\" (UID: \"851cebd7-9b55-4a3f-8067-2de2b32dd855\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.833579 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.833554 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/851cebd7-9b55-4a3f-8067-2de2b32dd855-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fsxpm\" (UID: \"851cebd7-9b55-4a3f-8067-2de2b32dd855\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.838567 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.838546 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljqv6\" (UniqueName: \"kubernetes.io/projected/851cebd7-9b55-4a3f-8067-2de2b32dd855-kube-api-access-ljqv6\") pod \"seaweedfs-tls-serving-7fd5766db9-fsxpm\" (UID: \"851cebd7-9b55-4a3f-8067-2de2b32dd855\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 07:59:59.917892 ip-10-0-130-28 kubenswrapper[2567]: I0417 07:59:59.917839 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" Apr 17 08:00:00.069004 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.068982 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-db59969c6-c57gh_6a8364f9-20d1-4c3d-bd17-94f7f0041431/console/0.log" Apr 17 08:00:00.069111 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.069042 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-db59969c6-c57gh" Apr 17 08:00:00.133220 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133191 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-config\") pod \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " Apr 17 08:00:00.133399 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133247 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-serving-cert\") pod \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " Apr 17 08:00:00.133399 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133273 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-oauth-serving-cert\") pod \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " Apr 17 08:00:00.133399 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133344 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-oauth-config\") pod \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " Apr 17 08:00:00.133399 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133361 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-trusted-ca-bundle\") pod \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " Apr 17 08:00:00.133399 
ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133388 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-service-ca\") pod \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " Apr 17 08:00:00.133731 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133415 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9spf\" (UniqueName: \"kubernetes.io/projected/6a8364f9-20d1-4c3d-bd17-94f7f0041431-kube-api-access-s9spf\") pod \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\" (UID: \"6a8364f9-20d1-4c3d-bd17-94f7f0041431\") " Apr 17 08:00:00.133731 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133665 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-config" (OuterVolumeSpecName: "console-config") pod "6a8364f9-20d1-4c3d-bd17-94f7f0041431" (UID: "6a8364f9-20d1-4c3d-bd17-94f7f0041431"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:00.133865 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133832 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6a8364f9-20d1-4c3d-bd17-94f7f0041431" (UID: "6a8364f9-20d1-4c3d-bd17-94f7f0041431"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:00.133930 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133888 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-service-ca" (OuterVolumeSpecName: "service-ca") pod "6a8364f9-20d1-4c3d-bd17-94f7f0041431" (UID: "6a8364f9-20d1-4c3d-bd17-94f7f0041431"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:00.133984 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.133918 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6a8364f9-20d1-4c3d-bd17-94f7f0041431" (UID: "6a8364f9-20d1-4c3d-bd17-94f7f0041431"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:00.135708 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.135669 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6a8364f9-20d1-4c3d-bd17-94f7f0041431" (UID: "6a8364f9-20d1-4c3d-bd17-94f7f0041431"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:00:00.135708 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.135697 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8364f9-20d1-4c3d-bd17-94f7f0041431-kube-api-access-s9spf" (OuterVolumeSpecName: "kube-api-access-s9spf") pod "6a8364f9-20d1-4c3d-bd17-94f7f0041431" (UID: "6a8364f9-20d1-4c3d-bd17-94f7f0041431"). InnerVolumeSpecName "kube-api-access-s9spf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:00:00.135848 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.135755 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6a8364f9-20d1-4c3d-bd17-94f7f0041431" (UID: "6a8364f9-20d1-4c3d-bd17-94f7f0041431"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:00:00.234339 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.234269 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-oauth-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 08:00:00.234339 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.234313 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-trusted-ca-bundle\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 08:00:00.234339 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.234322 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-service-ca\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 08:00:00.234339 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.234332 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s9spf\" (UniqueName: \"kubernetes.io/projected/6a8364f9-20d1-4c3d-bd17-94f7f0041431-kube-api-access-s9spf\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 08:00:00.234339 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.234341 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-config\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 08:00:00.234561 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.234350 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a8364f9-20d1-4c3d-bd17-94f7f0041431-console-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 08:00:00.234561 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.234358 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a8364f9-20d1-4c3d-bd17-94f7f0041431-oauth-serving-cert\") on node \"ip-10-0-130-28.ec2.internal\" DevicePath \"\"" Apr 17 08:00:00.262977 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.262956 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm"] Apr 17 08:00:00.265037 ip-10-0-130-28 kubenswrapper[2567]: W0417 08:00:00.265013 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod851cebd7_9b55_4a3f_8067_2de2b32dd855.slice/crio-bd3e3b025222837407b12df234c6dcc21c1280e858e55d3e1f3b1f46c425e332 WatchSource:0}: Error finding container bd3e3b025222837407b12df234c6dcc21c1280e858e55d3e1f3b1f46c425e332: Status 404 returned error can't find the container with id bd3e3b025222837407b12df234c6dcc21c1280e858e55d3e1f3b1f46c425e332 Apr 17 08:00:00.822621 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.822587 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" event={"ID":"851cebd7-9b55-4a3f-8067-2de2b32dd855","Type":"ContainerStarted","Data":"bd3e3b025222837407b12df234c6dcc21c1280e858e55d3e1f3b1f46c425e332"} Apr 17 08:00:00.823783 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.823769 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-db59969c6-c57gh_6a8364f9-20d1-4c3d-bd17-94f7f0041431/console/0.log" Apr 17 08:00:00.823860 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.823801 2567 generic.go:358] "Generic (PLEG): container finished" podID="6a8364f9-20d1-4c3d-bd17-94f7f0041431" containerID="d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce" exitCode=2 Apr 17 08:00:00.823860 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.823827 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-db59969c6-c57gh" event={"ID":"6a8364f9-20d1-4c3d-bd17-94f7f0041431","Type":"ContainerDied","Data":"d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce"} Apr 17 08:00:00.823860 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.823859 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-db59969c6-c57gh" event={"ID":"6a8364f9-20d1-4c3d-bd17-94f7f0041431","Type":"ContainerDied","Data":"5dd27097bde8b1cc1d26c31a1f9826817d974a72f5464a558d8847d44218d1be"} Apr 17 08:00:00.823973 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.823867 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-db59969c6-c57gh" Apr 17 08:00:00.823973 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.823873 2567 scope.go:117] "RemoveContainer" containerID="d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce" Apr 17 08:00:00.832377 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.832358 2567 scope.go:117] "RemoveContainer" containerID="d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce" Apr 17 08:00:00.832611 ip-10-0-130-28 kubenswrapper[2567]: E0417 08:00:00.832589 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce\": container with ID starting with d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce not found: ID does not exist" containerID="d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce" Apr 17 08:00:00.832672 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.832623 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce"} err="failed to get container status \"d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce\": rpc error: code = NotFound desc = could not find container \"d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce\": container with ID starting with d381fc7e02a280f3f47273040799d6f802dbc245cae2f301446589545badc1ce not found: ID does not exist" Apr 17 08:00:00.845069 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.845041 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-db59969c6-c57gh"] Apr 17 08:00:00.848080 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:00.848058 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-db59969c6-c57gh"] Apr 17 08:00:01.784512 ip-10-0-130-28 kubenswrapper[2567]: I0417 
08:00:01.784451 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8364f9-20d1-4c3d-bd17-94f7f0041431" path="/var/lib/kubelet/pods/6a8364f9-20d1-4c3d-bd17-94f7f0041431/volumes" Apr 17 08:00:01.829240 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:01.829203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" event={"ID":"851cebd7-9b55-4a3f-8067-2de2b32dd855","Type":"ContainerStarted","Data":"096178eea914c9fa2c01a2664b7088136068feb408f4e7c4921763b4f218151a"} Apr 17 08:00:01.846209 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:00:01.846131 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fsxpm" podStartSLOduration=1.992163619 podStartE2EDuration="2.846107375s" podCreationTimestamp="2026-04-17 07:59:59 +0000 UTC" firstStartedPulling="2026-04-17 08:00:00.266325829 +0000 UTC m=+519.069276549" lastFinishedPulling="2026-04-17 08:00:01.120269586 +0000 UTC m=+519.923220305" observedRunningTime="2026-04-17 08:00:01.843203559 +0000 UTC m=+520.646154301" watchObservedRunningTime="2026-04-17 08:00:01.846107375 +0000 UTC m=+520.649058119" Apr 17 08:01:21.728302 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:01:21.728259 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:01:21.728826 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:01:21.728705 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:01:21.734596 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:01:21.734564 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:01:21.734991 
ip-10-0-130-28 kubenswrapper[2567]: I0417 08:01:21.734966 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:03:08.391420 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.391347 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7"] Apr 17 08:03:08.391856 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.391739 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a8364f9-20d1-4c3d-bd17-94f7f0041431" containerName="console" Apr 17 08:03:08.391856 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.391751 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8364f9-20d1-4c3d-bd17-94f7f0041431" containerName="console" Apr 17 08:03:08.391856 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.391825 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a8364f9-20d1-4c3d-bd17-94f7f0041431" containerName="console" Apr 17 08:03:08.394764 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.394739 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" Apr 17 08:03:08.397484 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.397463 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9pcqp\"" Apr 17 08:03:08.402901 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.402876 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7"] Apr 17 08:03:08.406040 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.406022 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" Apr 17 08:03:08.533888 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.533867 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7"] Apr 17 08:03:08.536026 ip-10-0-130-28 kubenswrapper[2567]: W0417 08:03:08.535994 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb248aa7d_6e4f_43f2_9713_103cd7bfc639.slice/crio-9efeaf2a6d68a10441c8fc87bc086154dbd3380141960f761b4446720c25f694 WatchSource:0}: Error finding container 9efeaf2a6d68a10441c8fc87bc086154dbd3380141960f761b4446720c25f694: Status 404 returned error can't find the container with id 9efeaf2a6d68a10441c8fc87bc086154dbd3380141960f761b4446720c25f694 Apr 17 08:03:08.537859 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:08.537840 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:03:09.497969 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:09.497936 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" event={"ID":"b248aa7d-6e4f-43f2-9713-103cd7bfc639","Type":"ContainerStarted","Data":"9efeaf2a6d68a10441c8fc87bc086154dbd3380141960f761b4446720c25f694"} Apr 17 08:03:10.503011 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:10.502970 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" event={"ID":"b248aa7d-6e4f-43f2-9713-103cd7bfc639","Type":"ContainerStarted","Data":"a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707"} Apr 17 08:03:10.503455 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:10.503132 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" Apr 17 08:03:10.504887 
ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:10.504864 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" Apr 17 08:03:10.519300 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:03:10.519241 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" podStartSLOduration=1.522826556 podStartE2EDuration="2.519229648s" podCreationTimestamp="2026-04-17 08:03:08 +0000 UTC" firstStartedPulling="2026-04-17 08:03:08.538029825 +0000 UTC m=+707.340980546" lastFinishedPulling="2026-04-17 08:03:09.534432902 +0000 UTC m=+708.337383638" observedRunningTime="2026-04-17 08:03:10.517433865 +0000 UTC m=+709.320384603" watchObservedRunningTime="2026-04-17 08:03:10.519229648 +0000 UTC m=+709.322180454" Apr 17 08:04:33.506009 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:33.505973 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-pxcq7_b248aa7d-6e4f-43f2-9713-103cd7bfc639/kserve-container/0.log" Apr 17 08:04:33.840307 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:33.840199 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7"] Apr 17 08:04:33.840520 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:33.840472 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" podUID="b248aa7d-6e4f-43f2-9713-103cd7bfc639" containerName="kserve-container" containerID="cri-o://a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707" gracePeriod=30 Apr 17 08:04:34.095835 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.095760 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" Apr 17 08:04:34.817842 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.817809 2567 generic.go:358] "Generic (PLEG): container finished" podID="b248aa7d-6e4f-43f2-9713-103cd7bfc639" containerID="a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707" exitCode=2 Apr 17 08:04:34.818359 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.817880 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" Apr 17 08:04:34.818359 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.817901 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" event={"ID":"b248aa7d-6e4f-43f2-9713-103cd7bfc639","Type":"ContainerDied","Data":"a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707"} Apr 17 08:04:34.818359 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.817941 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7" event={"ID":"b248aa7d-6e4f-43f2-9713-103cd7bfc639","Type":"ContainerDied","Data":"9efeaf2a6d68a10441c8fc87bc086154dbd3380141960f761b4446720c25f694"} Apr 17 08:04:34.818359 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.817963 2567 scope.go:117] "RemoveContainer" containerID="a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707" Apr 17 08:04:34.827384 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.827279 2567 scope.go:117] "RemoveContainer" containerID="a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707" Apr 17 08:04:34.827630 ip-10-0-130-28 kubenswrapper[2567]: E0417 08:04:34.827611 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707\": container with ID starting 
with a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707 not found: ID does not exist" containerID="a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707" Apr 17 08:04:34.827682 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.827639 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707"} err="failed to get container status \"a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707\": rpc error: code = NotFound desc = could not find container \"a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707\": container with ID starting with a50fc75d93301344fab45108168f38317563793bf7b440322dd966af85ebd707 not found: ID does not exist" Apr 17 08:04:34.838757 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.838731 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7"] Apr 17 08:04:34.841760 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:34.841739 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-pxcq7"] Apr 17 08:04:35.783465 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:04:35.783434 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b248aa7d-6e4f-43f2-9713-103cd7bfc639" path="/var/lib/kubelet/pods/b248aa7d-6e4f-43f2-9713-103cd7bfc639/volumes" Apr 17 08:06:21.753926 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:06:21.753898 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:06:21.756233 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:06:21.756210 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" 
Apr 17 08:06:21.760023 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:06:21.760007 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:06:21.762331 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:06:21.762317 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:11:21.781506 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:11:21.781476 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:11:21.786074 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:11:21.786050 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:11:21.796798 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:11:21.796773 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:11:21.800894 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:11:21.800875 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:16:21.817458 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:16:21.817416 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:16:21.822683 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:16:21.822659 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:16:21.824171 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:16:21.824151 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:16:21.828923 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:16:21.828907 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:21:21.847487 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:21:21.847456 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:21:21.853336 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:21:21.853312 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:21:21.853714 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:21:21.853697 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:21:21.859588 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:21:21.859572 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:26:21.878305 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:26:21.878233 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:26:21.884167 ip-10-0-130-28 kubenswrapper[2567]: I0417 
08:26:21.884139 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:26:21.884936 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:26:21.884916 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:26:21.890089 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:26:21.890054 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:31:21.906057 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:31:21.906027 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:31:21.912598 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:31:21.912566 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:31:21.912829 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:31:21.912805 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:31:21.925532 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:31:21.925507 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:36:21.935089 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:36:21.935058 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 
17 08:36:21.942014 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:36:21.941985 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:36:21.947917 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:36:21.947892 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:36:21.954402 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:36:21.954380 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:41:21.962821 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:41:21.962779 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:41:21.969819 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:41:21.969781 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:41:21.976625 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:41:21.976604 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:41:21.982532 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:41:21.982513 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:46:21.989577 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:46:21.989537 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:46:21.996984 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:46:21.996960 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:46:22.005350 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:46:22.005328 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:46:22.011668 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:46:22.011646 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log" Apr 17 08:51:09.031877 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:09.031839 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m6m2c_6bddba28-b3c8-46bd-bb8e-166c73a7acbe/global-pull-secret-syncer/0.log" Apr 17 08:51:09.133576 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:09.133548 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vpzvb_187a524e-8b69-4904-8985-6b33cf3dc3d1/konnectivity-agent/0.log" Apr 17 08:51:09.188915 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:09.188886 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-28.ec2.internal_90295ff52276f4357ebdbd2fe3bf0ff4/haproxy/0.log" Apr 17 08:51:12.430489 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.430457 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d4a32f9-181c-4398-82b0-3b1cf0ab3e87/alertmanager/0.log" Apr 17 08:51:12.452896 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.452869 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d4a32f9-181c-4398-82b0-3b1cf0ab3e87/config-reloader/0.log" Apr 17 08:51:12.479098 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.479073 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d4a32f9-181c-4398-82b0-3b1cf0ab3e87/kube-rbac-proxy-web/0.log" Apr 17 08:51:12.501950 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.501928 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d4a32f9-181c-4398-82b0-3b1cf0ab3e87/kube-rbac-proxy/0.log" Apr 17 08:51:12.524265 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.524243 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d4a32f9-181c-4398-82b0-3b1cf0ab3e87/kube-rbac-proxy-metric/0.log" Apr 17 08:51:12.551526 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.551506 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d4a32f9-181c-4398-82b0-3b1cf0ab3e87/prom-label-proxy/0.log" Apr 17 08:51:12.579053 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.579026 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d4a32f9-181c-4398-82b0-3b1cf0ab3e87/init-config-reloader/0.log" Apr 17 08:51:12.627660 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.627620 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-m4cqz_6ed2baca-4a17-4906-9829-56274b0374d5/cluster-monitoring-operator/0.log" Apr 17 08:51:12.650453 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.650426 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-hvztr_2bb4af48-7fc1-4da0-96dd-46c44953d2d1/kube-state-metrics/0.log" Apr 17 08:51:12.669148 ip-10-0-130-28 
kubenswrapper[2567]: I0417 08:51:12.669117 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-hvztr_2bb4af48-7fc1-4da0-96dd-46c44953d2d1/kube-rbac-proxy-main/0.log" Apr 17 08:51:12.689088 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.689002 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-hvztr_2bb4af48-7fc1-4da0-96dd-46c44953d2d1/kube-rbac-proxy-self/0.log" Apr 17 08:51:12.716820 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.716793 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7c648db48d-fm8f7_29d6bf46-15ba-4280-8c5e-c80fe3427b1d/metrics-server/0.log" Apr 17 08:51:12.849524 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.849492 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fdls8_992a09a6-6ee8-42d1-b1cc-ddac80952b0d/node-exporter/0.log" Apr 17 08:51:12.870615 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.870591 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fdls8_992a09a6-6ee8-42d1-b1cc-ddac80952b0d/kube-rbac-proxy/0.log" Apr 17 08:51:12.894819 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:12.894796 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fdls8_992a09a6-6ee8-42d1-b1cc-ddac80952b0d/init-textfile/0.log" Apr 17 08:51:13.244768 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:13.244735 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hkkmn_01f11896-aba0-4324-a913-4bc5ab88a7d4/prometheus-operator/0.log" Apr 17 08:51:13.266990 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:13.266964 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hkkmn_01f11896-aba0-4324-a913-4bc5ab88a7d4/kube-rbac-proxy/0.log" Apr 17 08:51:13.291250 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:13.291221 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-w6z6h_c586da6e-7ea7-4dbc-bc58-bec868051781/prometheus-operator-admission-webhook/0.log" Apr 17 08:51:13.388965 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:13.388932 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-547b55c77c-mkwgp_ed049ead-6ee3-4a75-945e-6168dd530b2c/thanos-query/0.log" Apr 17 08:51:13.408046 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:13.408020 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-547b55c77c-mkwgp_ed049ead-6ee3-4a75-945e-6168dd530b2c/kube-rbac-proxy-web/0.log" Apr 17 08:51:13.427667 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:13.427642 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-547b55c77c-mkwgp_ed049ead-6ee3-4a75-945e-6168dd530b2c/kube-rbac-proxy/0.log" Apr 17 08:51:13.447007 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:13.446983 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-547b55c77c-mkwgp_ed049ead-6ee3-4a75-945e-6168dd530b2c/prom-label-proxy/0.log" Apr 17 08:51:13.469244 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:13.469216 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-547b55c77c-mkwgp_ed049ead-6ee3-4a75-945e-6168dd530b2c/kube-rbac-proxy-rules/0.log" Apr 17 08:51:13.490035 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:13.490003 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-547b55c77c-mkwgp_ed049ead-6ee3-4a75-945e-6168dd530b2c/kube-rbac-proxy-metrics/0.log" Apr 17 08:51:14.602839 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:14.602805 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-22l88_0aec996c-f5f7-4af3-8685-2febd74582db/networking-console-plugin/0.log" Apr 17 08:51:15.009624 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:15.009593 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log" Apr 17 08:51:15.020484 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:15.020431 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/2.log" Apr 17 08:51:15.380315 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:15.380261 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cbb94d8cc-rksh9_f0b000a1-3cef-4304-9fdc-8848f27be403/console/0.log" Apr 17 08:51:15.419246 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:15.419219 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-tw9dk_7d7c2a20-730c-49bd-ac08-496b110637bd/download-server/0.log" Apr 17 08:51:15.789558 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:15.789531 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-7xz9b_13e57040-8bcb-45c5-9813-b2b4749fd4e4/volume-data-source-validator/0.log" Apr 17 08:51:16.054696 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.054608 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg"] Apr 17 08:51:16.055008 ip-10-0-130-28 
kubenswrapper[2567]: I0417 08:51:16.054996 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b248aa7d-6e4f-43f2-9713-103cd7bfc639" containerName="kserve-container" Apr 17 08:51:16.055049 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.055010 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b248aa7d-6e4f-43f2-9713-103cd7bfc639" containerName="kserve-container" Apr 17 08:51:16.055100 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.055089 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b248aa7d-6e4f-43f2-9713-103cd7bfc639" containerName="kserve-container" Apr 17 08:51:16.058698 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.058677 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.061233 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.061205 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jbj8t\"/\"default-dockercfg-rq654\"" Apr 17 08:51:16.061233 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.061233 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jbj8t\"/\"kube-root-ca.crt\"" Apr 17 08:51:16.062105 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.062069 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jbj8t\"/\"openshift-service-ca.crt\"" Apr 17 08:51:16.072569 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.072546 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg"] Apr 17 08:51:16.133982 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.133935 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-podres\") 
pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.133982 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.133995 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-sys\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.134208 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.134017 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn289\" (UniqueName: \"kubernetes.io/projected/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-kube-api-access-vn289\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.134208 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.134072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-proc\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.134208 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.134100 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-lib-modules\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.235005 ip-10-0-130-28 kubenswrapper[2567]: I0417 
08:51:16.234968 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-proc\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.235210 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.235016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-lib-modules\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.235210 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.235073 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-podres\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.235210 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.235099 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-sys\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.235210 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.235099 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-proc\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " 
pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.235210 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.235127 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn289\" (UniqueName: \"kubernetes.io/projected/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-kube-api-access-vn289\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.235210 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.235164 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-sys\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.235494 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.235232 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-podres\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.235494 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.235241 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-lib-modules\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" Apr 17 08:51:16.243499 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.243478 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn289\" (UniqueName: 
\"kubernetes.io/projected/8d6a6d91-2ff4-4350-8c0b-a1862cbf306d-kube-api-access-vn289\") pod \"perf-node-gather-daemonset-klhrg\" (UID: \"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg"
Apr 17 08:51:16.369055 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.368964 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg"
Apr 17 08:51:16.427508 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.427466 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mspd5_5e3cf222-71f9-4a25-88bb-37c528ac2994/dns/0.log"
Apr 17 08:51:16.446089 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.446042 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mspd5_5e3cf222-71f9-4a25-88bb-37c528ac2994/kube-rbac-proxy/0.log"
Apr 17 08:51:16.498833 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.498805 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg"]
Apr 17 08:51:16.501824 ip-10-0-130-28 kubenswrapper[2567]: W0417 08:51:16.501796 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8d6a6d91_2ff4_4350_8c0b_a1862cbf306d.slice/crio-c82b331ee9608d2dca7a69d9a043721023efedfa039ce0e0ece30991f800bf37 WatchSource:0}: Error finding container c82b331ee9608d2dca7a69d9a043721023efedfa039ce0e0ece30991f800bf37: Status 404 returned error can't find the container with id c82b331ee9608d2dca7a69d9a043721023efedfa039ce0e0ece30991f800bf37
Apr 17 08:51:16.503486 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.503470 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:51:16.562744 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.562722 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4dz7x_bb277a7c-b922-4c78-a4fd-5882a862b97a/dns-node-resolver/0.log"
Apr 17 08:51:16.851364 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.851320 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" event={"ID":"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d","Type":"ContainerStarted","Data":"d4a744638a26e3281f8b0647e4a057720ac3e63d64257f3ccc6ef8ee367f393f"}
Apr 17 08:51:16.851364 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.851365 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" event={"ID":"8d6a6d91-2ff4-4350-8c0b-a1862cbf306d","Type":"ContainerStarted","Data":"c82b331ee9608d2dca7a69d9a043721023efedfa039ce0e0ece30991f800bf37"}
Apr 17 08:51:16.851837 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.851459 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg"
Apr 17 08:51:16.866656 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:16.866610 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg" podStartSLOduration=0.86659351 podStartE2EDuration="866.59351ms" podCreationTimestamp="2026-04-17 08:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:51:16.864718321 +0000 UTC m=+3595.667669064" watchObservedRunningTime="2026-04-17 08:51:16.86659351 +0000 UTC m=+3595.669544250"
Apr 17 08:51:17.020164 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:17.020136 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ks9m9_7254920c-50ea-4fc4-b393-00fa4b69ad5b/node-ca/0.log"
Apr 17 08:51:18.082169 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:18.082131 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sm78q_fcbda289-b762-45ea-ba60-5188e612db63/serve-healthcheck-canary/0.log"
Apr 17 08:51:18.434756 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:18.434677 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f6tcx_fab94ad2-1267-48c8-9ec7-3160b92c3f4c/kube-rbac-proxy/0.log"
Apr 17 08:51:18.453095 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:18.453064 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f6tcx_fab94ad2-1267-48c8-9ec7-3160b92c3f4c/exporter/0.log"
Apr 17 08:51:18.472489 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:18.472457 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f6tcx_fab94ad2-1267-48c8-9ec7-3160b92c3f4c/extractor/0.log"
Apr 17 08:51:20.525909 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:20.525880 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-xp9q2_c4552494-143e-4167-9599-7018935accc9/server/0.log"
Apr 17 08:51:20.882529 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:20.882442 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-wlvqm_972f2ad3-6129-4d85-9ba7-6d661a492dad/manager/0.log"
Apr 17 08:51:20.971508 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:20.971479 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-5f85w_647fc402-09c1-4868-90f2-a04650ef09d1/seaweedfs/0.log"
Apr 17 08:51:20.993581 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:20.993552 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-mdclb_e5a78afb-f699-4914-a9d7-5e211d44043d/seaweedfs-tls-custom/0.log"
Apr 17 08:51:21.016170 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:21.016139 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-fsxpm_851cebd7-9b55-4a3f-8067-2de2b32dd855/seaweedfs-tls-serving/0.log"
Apr 17 08:51:22.025263 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:22.025236 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log"
Apr 17 08:51:22.031464 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:22.031441 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log"
Apr 17 08:51:22.034118 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:22.034095 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnwwn_4fe916f9-75e5-450b-9686-68166482e8a8/console-operator/1.log"
Apr 17 08:51:22.039639 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:22.039623 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log"
Apr 17 08:51:22.865595 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:22.865567 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-klhrg"
Apr 17 08:51:25.135889 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:25.135789 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fwwvt_f9503e60-cd11-4c96-a718-f33e86501791/kube-storage-version-migrator-operator/1.log"
Apr 17 08:51:25.137569 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:25.137542 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fwwvt_f9503e60-cd11-4c96-a718-f33e86501791/kube-storage-version-migrator-operator/0.log"
Apr 17 08:51:26.255128 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.255089 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qgq6m_b560c5d8-3216-49b3-be3c-2ad93d8b4e7a/kube-multus-additional-cni-plugins/0.log"
Apr 17 08:51:26.274771 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.274691 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qgq6m_b560c5d8-3216-49b3-be3c-2ad93d8b4e7a/egress-router-binary-copy/0.log"
Apr 17 08:51:26.296510 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.296483 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qgq6m_b560c5d8-3216-49b3-be3c-2ad93d8b4e7a/cni-plugins/0.log"
Apr 17 08:51:26.315702 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.315675 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qgq6m_b560c5d8-3216-49b3-be3c-2ad93d8b4e7a/bond-cni-plugin/0.log"
Apr 17 08:51:26.334322 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.334278 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qgq6m_b560c5d8-3216-49b3-be3c-2ad93d8b4e7a/routeoverride-cni/0.log"
Apr 17 08:51:26.354410 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.354386 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qgq6m_b560c5d8-3216-49b3-be3c-2ad93d8b4e7a/whereabouts-cni-bincopy/0.log"
Apr 17 08:51:26.374094 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.374070 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qgq6m_b560c5d8-3216-49b3-be3c-2ad93d8b4e7a/whereabouts-cni/0.log"
Apr 17 08:51:26.575938 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.575862 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-grgq6_efad7b51-5842-4aff-abb9-6379ecca5cc4/kube-multus/0.log"
Apr 17 08:51:26.728612 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.728578 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nnkhx_86e593a1-ee06-4a3a-9bef-3d1c3097b01d/network-metrics-daemon/0.log"
Apr 17 08:51:26.746920 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:26.746892 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nnkhx_86e593a1-ee06-4a3a-9bef-3d1c3097b01d/kube-rbac-proxy/0.log"
Apr 17 08:51:27.522267 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:27.522235 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-controller/0.log"
Apr 17 08:51:27.538355 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:27.538323 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/0.log"
Apr 17 08:51:27.570416 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:27.570389 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovn-acl-logging/1.log"
Apr 17 08:51:27.590584 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:27.590487 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/kube-rbac-proxy-node/0.log"
Apr 17 08:51:27.610118 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:27.610087 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 08:51:27.625697 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:27.625671 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/northd/0.log"
Apr 17 08:51:27.647681 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:27.647656 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/nbdb/0.log"
Apr 17 08:51:27.669221 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:27.669188 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/sbdb/0.log"
Apr 17 08:51:27.857935 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:27.857863 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4l4mp_a5237768-3d38-4f21-8b97-c1ffd5d7cec2/ovnkube-controller/0.log"
Apr 17 08:51:29.380102 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:29.380073 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-gnk92_ecac921a-12f0-4bcc-ac34-e10db9b1ae9a/check-endpoints/0.log"
Apr 17 08:51:29.400642 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:29.400612 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-gz9xl_c2f87d28-3811-45b6-bdd2-ca07124aa872/network-check-target-container/0.log"
Apr 17 08:51:30.323435 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:30.323396 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-zn97z_88aa5ccb-f3a4-4df2-90c3-b1450d9b5ad6/iptables-alerter/0.log"
Apr 17 08:51:30.904483 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:30.904412 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-cqr65_6c6bafdd-ce56-49a3-8070-bd48e97302b2/tuned/0.log"
Apr 17 08:51:32.598084 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:32.598030 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qhzj4_60c87110-0aed-4648-8660-2c08620770a1/cluster-samples-operator/0.log"
Apr 17 08:51:32.613569 ip-10-0-130-28 kubenswrapper[2567]: I0417 08:51:32.613542 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qhzj4_60c87110-0aed-4648-8660-2c08620770a1/cluster-samples-operator-watch/0.log"