Apr 24 19:07:06.412919 ip-10-0-131-214 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 19:07:06.858263 ip-10-0-131-214 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:07:06.858263 ip-10-0-131-214 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 19:07:06.858263 ip-10-0-131-214 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:07:06.858263 ip-10-0-131-214 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 19:07:06.858263 ip-10-0-131-214 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
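The deprecation warnings above all point at the same remedy: move these settings into the KubeletConfiguration file named by `--config` (here `/etc/kubernetes/kubelet.conf`). A minimal sketch of the equivalent config-file fields follows; the field names come from the upstream `kubelet.config.k8s.io/v1beta1` API, but the resource values shown are illustrative, not taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (socket path from the FLAG dump on this node)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path here is illustrative)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (values here are illustrative)
systemReserved:
  cpu: 500m
  memory: 1Gi
# replaces --minimum-container-ttl-duration, per the warning's guidance
evictionHard:
  memory.available: 100Mi
```

On OpenShift nodes like this one, the kubelet config file is managed by the Machine Config Operator, so such changes would normally go through a KubeletConfig custom resource rather than a direct edit.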
Apr 24 19:07:06.859167 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.859076 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 19:07:06.866036 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866020 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:07:06.866036 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866036 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866040 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866044 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866047 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866050 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866053 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866056 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866059 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866063 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866066 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866074 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866077 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866080 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866083 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866085 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866088 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866091 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866093 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866096 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866099 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:07:06.866102 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866101 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866104 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866107 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866110 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866113 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866116 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866119 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866121 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866124 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866127 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866130 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866133 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866135 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866138 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866140 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866143 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866145 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866148 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866150 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866153 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:07:06.866571 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866155 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866157 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866160 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866163 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866165 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866170 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866174 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866176 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866179 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866182 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866184 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866187 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866189 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866191 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866195 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866198 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866200 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866203 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866205 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:07:06.867073 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866208 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866211 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866214 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866216 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866219 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866222 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866225 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866227 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866230 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866232 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866235 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866237 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866240 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866242 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866245 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866248 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866250 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866253 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866256 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:07:06.867533 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866258 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866261 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866263 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866267 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866269 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866272 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866276 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866677 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866684 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866687 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866691 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866695 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866698 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866701 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866704 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866707 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866710 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866713 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866715 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:07:06.868049 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866719 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866723 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866726 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866729 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866731 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866734 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866737 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866739 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866742 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866744 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866747 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866750 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866752 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866756 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866759 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866761 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866764 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866767 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866769 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866771 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:07:06.868505 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866774 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866777 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866780 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866783 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866785 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866788 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866790 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866794 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866796 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866799 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866801 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866804 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866806 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866809 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866811 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866813 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866816 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866819 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866821 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:07:06.869014 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866823 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866826 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866829 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866831 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866834 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866838 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866841 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866845 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866848 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866851 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866854 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866856 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866859 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866862 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866865 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866867 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866870 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866872 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866875 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:07:06.869474 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866877 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866880 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866882 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866884 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866887 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866890 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866892 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866895 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866897 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866900 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866903 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866905 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866908 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866910 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866913 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.866915 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868775 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868785 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868793 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868799 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868804 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868807 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 19:07:06.869966 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868812 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868816 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868820 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868823 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868827 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868830 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868834 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868837 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868840 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868843 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868846 2573 flags.go:64] FLAG: --cloud-config=""
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868849 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868852 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868856 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868859 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868862 2573 flags.go:64] FLAG: --config-dir=""
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868865 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868868 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868872 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868875 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868878 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868882 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868885 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868888 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 19:07:06.870490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868891 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868894 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868897 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868906 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868909 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868912 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868916 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868920 2573 flags.go:64] FLAG: --enable-server="true"
Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868923 2573 flags.go:64] FLAG:
--enforce-node-allocatable="[pods]" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868927 2573 flags.go:64] FLAG: --event-burst="100" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868930 2573 flags.go:64] FLAG: --event-qps="50" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868934 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868937 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868941 2573 flags.go:64] FLAG: --eviction-hard="" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868944 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868947 2573 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868950 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868953 2573 flags.go:64] FLAG: --eviction-soft="" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868956 2573 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868960 2573 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868963 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868966 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868968 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 19:07:06.871163 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:07:06.868971 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868974 2573 flags.go:64] FLAG: --feature-gates="" Apr 24 19:07:06.871163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868978 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868981 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868985 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868988 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868991 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868994 2573 flags.go:64] FLAG: --help="false" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.868997 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-131-214.ec2.internal" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869000 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869003 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869006 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869009 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869012 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" 
Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869019 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869022 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869026 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869029 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869032 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869035 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869037 2573 flags.go:64] FLAG: --kube-reserved="" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869041 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869044 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869047 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869050 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869053 2573 flags.go:64] FLAG: --lock-file="" Apr 24 19:07:06.871778 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869055 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869058 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869061 2573 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869067 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869070 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869073 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869075 2573 flags.go:64] FLAG: --logging-format="text" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869078 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869082 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869084 2573 flags.go:64] FLAG: --manifest-url="" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869087 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869091 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869095 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869099 2573 flags.go:64] FLAG: --max-pods="110" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869102 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869105 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869108 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 
19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869111 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869113 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869117 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869124 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869132 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869135 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869138 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 19:07:06.872363 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869141 2573 flags.go:64] FLAG: --pod-cidr="" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869158 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869165 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869168 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869172 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869175 2573 flags.go:64] FLAG: --port="10250" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869178 2573 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869181 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0610375fbfed59f5a" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869184 2573 flags.go:64] FLAG: --qos-reserved="" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869188 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869191 2573 flags.go:64] FLAG: --register-node="true" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869194 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869197 2573 flags.go:64] FLAG: --register-with-taints="" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869201 2573 flags.go:64] FLAG: --registry-burst="10" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869203 2573 flags.go:64] FLAG: --registry-qps="5" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869206 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869209 2573 flags.go:64] FLAG: --reserved-memory="" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869213 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869216 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869219 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869222 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869225 2573 
flags.go:64] FLAG: --runonce="false" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869228 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869231 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869234 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 24 19:07:06.873025 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869237 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869239 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869243 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869247 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869250 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869253 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869255 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869260 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869263 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869266 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869269 2573 flags.go:64] FLAG: --system-cgroups="" Apr 24 
19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869272 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869278 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869281 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869284 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869288 2573 flags.go:64] FLAG: --tls-min-version="" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869291 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869294 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869297 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869300 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869303 2573 flags.go:64] FLAG: --v="2" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869307 2573 flags.go:64] FLAG: --version="false" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869311 2573 flags.go:64] FLAG: --vmodule="" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869315 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.869319 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 19:07:06.873652 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869783 2573 
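The FLAG dump above includes the three flags the startup warnings call deprecated (`--container-runtime-endpoint="/var/run/crio/crio.sock"`, `--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"`, `--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"`), each of which the kubelet asks to be moved into the config file named by `--config="/etc/kubernetes/kubelet.conf"`. A minimal sketch of the equivalent `KubeletConfiguration` stanzas, assuming the values shown in this log (on a machine-config-managed OpenShift node this file is rendered by the operator, so this is illustrative, not a change to make by hand):

```yaml
# Sketch of the KubeletConfiguration equivalents of the deprecated flags
# logged above; values copied from this node's FLAG dump.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
  ephemeral-storage: 1Gi
```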
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869788 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869791 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869795 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869798 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869801 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869803 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869807 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869810 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869813 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869816 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869819 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869822 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 19:07:06.874307 
ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869824 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869827 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869830 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869833 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869835 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869841 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869844 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 19:07:06.874307 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869846 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869849 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869851 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869854 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869857 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869859 2573 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869862 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869865 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869868 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869870 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869873 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869875 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869878 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869880 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869883 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869885 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869888 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869892 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869895 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 19:07:06.874891 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869899 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869902 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869905 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869908 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869910 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869913 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869916 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869918 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869921 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869924 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869926 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 
19:07:06.869931 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869934 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869936 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869939 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869941 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869944 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869946 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869949 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869951 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 19:07:06.875421 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869954 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869956 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869959 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869961 2573 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869964 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869966 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869969 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869972 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869975 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869978 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869980 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869983 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869986 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869988 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869991 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869993 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869996 
2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.869998 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.870001 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.870003 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 19:07:06.876015 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.870006 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:07:06.876730 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.870009 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 19:07:06.876730 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.870012 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 19:07:06.876730 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.870016 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 19:07:06.876730 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.870019 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 19:07:06.876730 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.870023 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 19:07:06.876730 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.870027 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 19:07:06.876730 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.870734 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 19:07:06.878728 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.878710 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 19:07:06.878728 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.878728 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878778 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878783 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878787 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878789 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878792 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878795 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere 
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878798 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878802 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878804 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878807 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878810 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878812 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878815 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878818 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878820 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878823 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:07:06.878820 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878826 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878829 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878832 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878835 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878838 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878841 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878843 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878846 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878849 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878852 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878855 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878858 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878860 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878863 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878866 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878868 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878871 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878873 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878876 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878878 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:07:06.879246 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878881 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878884 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878886 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878889 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878892 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878894 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878897 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878900 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878902 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878905 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878908 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878910 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878912 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878915 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878919 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878922 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878924 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878927 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878930 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878932 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:07:06.879748 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878935 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878937 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878940 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878942 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878945 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878948 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878952 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878956 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878958 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878961 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878963 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878966 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878968 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878971 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878973 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878976 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878978 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878981 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878984 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878987 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:07:06.880236 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878990 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878994 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.878999 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879002 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879005 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879008 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879011 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879013 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879016 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879019 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.879024 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879116 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879122 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879124 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879127 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:07:06.880734 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879130 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879133 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879135 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879138 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879141 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879144 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879146 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879149 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879151 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879154 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879157 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879160 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879162 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879165 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879167 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879170 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879173 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879176 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879179 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879182 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:07:06.881100 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879184 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879187 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879189 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879192 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879195 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879198 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879200 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879203 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879205 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879209 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879211 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879214 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879216 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879219 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879221 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879224 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879227 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879229 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879232 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879234 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:07:06.881592 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879237 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879239 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879242 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879244 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879247 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879249 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879252 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879255 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879257 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879260 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879262 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879265 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879268 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879271 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879273 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879276 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879279 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879281 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879285 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:07:06.882106 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879289 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879292 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879295 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879298 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879300 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879303 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879306 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879308 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879312 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879316 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879319 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879322 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879324 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879327 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879329 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879332 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879334 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879337 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879339 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879342 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:07:06.882580 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879344 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 19:07:06.883085 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879347 2573 feature_gate.go:328] unrecognized feature 
gate: IngressControllerDynamicConfigurationManager Apr 24 19:07:06.883085 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:06.879349 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 19:07:06.883085 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.879355 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 19:07:06.883085 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.880202 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 19:07:06.883085 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.882274 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 19:07:06.883362 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.883350 2573 server.go:1019] "Starting client certificate rotation" Apr 24 19:07:06.883472 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.883454 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 19:07:06.883505 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.883492 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 19:07:06.908794 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.908777 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 19:07:06.912803 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.912788 2573 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:07:06.926522 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.926504 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 24 19:07:06.932478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.932461 2573 log.go:25] "Validated CRI v1 image API"
Apr 24 19:07:06.933818 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.933800 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 19:07:06.937746 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.937727 2573 fs.go:135] Filesystem UUIDs: map[216f06fa-a70c-45d8-b7fc-3e250c7c0c55:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b258efc7-a455-4836-826a-480d66ecd656:/dev/nvme0n1p4]
Apr 24 19:07:06.937799 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.937745 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 19:07:06.938384 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.938369 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 19:07:06.944179 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.944075 2573 manager.go:217] Machine: {Timestamp:2026-04-24 19:07:06.942027149 +0000 UTC m=+0.412186668 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3107202 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec279da7b3666ba7162b1ecae367d651 SystemUUID:ec279da7-b366-6ba7-162b-1ecae367d651 BootID:8614105c-616b-41d3-80ec-13d448ca09a3 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:67:0c:51:67:17 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:67:0c:51:67:17 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:5e:2b:70:16:96 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 19:07:06.944179 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.944175 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 19:07:06.944289 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.944281 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 19:07:06.945272 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.945253 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 19:07:06.945398 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.945274 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config"
nodeConfig={"NodeName":"ip-10-0-131-214.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 19:07:06.945448 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.945407 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 19:07:06.945448 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.945415 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 19:07:06.945448 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.945427 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 19:07:06.946905 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.946894 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 19:07:06.948382 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.948373 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 19:07:06.948494 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.948485 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 19:07:06.950896 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.950888 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 19:07:06.950937 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.950899 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 19:07:06.950937 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.950913 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 19:07:06.950937 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.950923 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 24 19:07:06.950937 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.950932 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 19:07:06.951994 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.951983 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 19:07:06.952035 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.952001 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 19:07:06.957038 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.957020 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 19:07:06.958414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.958399 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 19:07:06.960267 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960256 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 19:07:06.960326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960273 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 19:07:06.960326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960280 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 19:07:06.960326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960286 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 19:07:06.960326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960291 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 19:07:06.960326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960297 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 19:07:06.960326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960304 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 19:07:06.960326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960310 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 19:07:06.960326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960317 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 19:07:06.960326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960322 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 19:07:06.960567 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960335 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24
19:07:06.960567 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.960344 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 19:07:06.961185 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.961176 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 19:07:06.961222 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.961186 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 19:07:06.964568 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.964555 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 19:07:06.964648 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.964588 2573 server.go:1295] "Started kubelet"
Apr 24 19:07:06.964719 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.964688 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 19:07:06.964777 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.964695 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 19:07:06.964914 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.964756 2573 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 19:07:06.965225 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.965201 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-214.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 19:07:06.965357 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:06.965335 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-214.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 19:07:06.965470 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:06.965452 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 19:07:06.965560 ip-10-0-131-214 systemd[1]: Started Kubernetes Kubelet.
Apr 24 19:07:06.966063 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.965947 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 19:07:06.967481 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.967468 2573 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 19:07:06.974753 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.974734 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 19:07:06.975349 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.975277 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 19:07:06.976216 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:06.972204 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-214.ec2.internal.18a9607f189f20d1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-214.ec2.internal,UID:ip-10-0-131-214.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-214.ec2.internal,},FirstTimestamp:2026-04-24 19:07:06.964566225 +0000 UTC m=+0.434725743,LastTimestamp:2026-04-24 19:07:06.964566225 +0000 UTC m=+0.434725743,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-214.ec2.internal,}"
Apr 24 19:07:06.976386 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.976372 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 19:07:06.976597 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.976516 2573 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 19:07:06.976716 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.976706 2573 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 19:07:06.976927 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:06.976906 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 19:07:06.977039 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.976947 2573 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 19:07:06.977139 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.977121 2573 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 19:07:06.977221 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.977067 2573 factory.go:153] Registering CRI-O factory
Apr 24 19:07:06.977221 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.977159 2573 factory.go:223] Registration of the crio container factory successfully
Apr 24 19:07:06.977366 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.977352 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 19:07:06.977424 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.977367 2573 factory.go:55] Registering systemd factory
Apr 24 19:07:06.977424 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.977376 2573 factory.go:223] Registration of the systemd
container factory successfully
Apr 24 19:07:06.977424 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.977400 2573 factory.go:103] Registering Raw factory
Apr 24 19:07:06.977424 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.977419 2573 manager.go:1196] Started watching for new ooms in manager
Apr 24 19:07:06.978416 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:06.978393 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:06.978677 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.978660 2573 manager.go:319] Starting recovery of all containers
Apr 24 19:07:06.979121 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:06.979092 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-214.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 19:07:06.981189 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:06.981042 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 19:07:06.988556 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.988536 2573 manager.go:324] Recovery completed
Apr 24 19:07:06.992862 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.992848 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 19:07:06.996741 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.996724 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 19:07:06.996807 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.996754 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 19:07:06.996807 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.996765 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 19:07:06.997212 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.997195 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 19:07:06.997212 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.997211 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 19:07:06.997316 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.997230 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 19:07:06.998980 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:06.998913 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-214.ec2.internal.18a9607f1a8a136e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-214.ec2.internal,UID:ip-10-0-131-214.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-214.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-214.ec2.internal,},FirstTimestamp:2026-04-24 19:07:06.996740974 +0000 UTC m=+0.466900493,LastTimestamp:2026-04-24 19:07:06.996740974 +0000 UTC m=+0.466900493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-214.ec2.internal,}"
Apr 24 19:07:06.999561 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.999539 2573 policy_none.go:49] "None policy: Start"
Apr 24 19:07:06.999561 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.999554 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 19:07:06.999561 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.999565 2573 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 19:07:06.999995 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:06.999976 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gpnph"
Apr 24 19:07:07.008476 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.008454 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gpnph"
Apr 24 19:07:07.009959 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.009896 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-214.ec2.internal.18a9607f1a8a5cbd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-214.ec2.internal,UID:ip-10-0-131-214.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-214.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-214.ec2.internal,},FirstTimestamp:2026-04-24 19:07:06.996759741 +0000 UTC m=+0.466919259,LastTimestamp:2026-04-24 19:07:06.996759741 +0000 UTC m=+0.466919259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-214.ec2.internal,}"
Apr 24 19:07:07.039247 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.039229 2573 manager.go:341] "Starting Device Plugin manager" Apr 24 19:07:07.042217
ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.039257 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 19:07:07.042217 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.039266 2573 server.go:85] "Starting device plugin registration server"
Apr 24 19:07:07.042217 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.039493 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 19:07:07.042217 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.039505 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 19:07:07.042217 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.039624 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 19:07:07.042217 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.039700 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 19:07:07.042217 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.039706 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 19:07:07.042217 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.040274 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 19:07:07.042217 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.040310 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.113675 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.113594 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 19:07:07.114771 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.114754 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 19:07:07.114882 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.114782 2573 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 19:07:07.114882 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.114803 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 19:07:07.114882 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.114811 2573 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 19:07:07.114882 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.114848 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 19:07:07.117277 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.117258 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 19:07:07.140452 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.140434 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 19:07:07.141365 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.141344 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 19:07:07.141452 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.141374 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 19:07:07.141452 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.141383 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 19:07:07.141452 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.141409 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.149668 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.149644 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.149762 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.149676 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-214.ec2.internal\": node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.169771 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.169747 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.215438 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.215382 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal"]
Apr 24 19:07:07.215585 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.215490 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 19:07:07.216380 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.216365 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 19:07:07.216458 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.216392 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 19:07:07.216458 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.216402 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 19:07:07.217737 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.217723 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 19:07:07.217901 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.217886 2573 util.go:30] "No
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.218000 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.217921 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 19:07:07.218473 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.218454 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 19:07:07.218560 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.218482 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 19:07:07.218560 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.218454 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 19:07:07.218560 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.218492 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 19:07:07.218560 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.218515 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 19:07:07.218560 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.218533 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 19:07:07.219736 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.219722 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.219820 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.219752 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 19:07:07.220402 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.220385 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 19:07:07.220498 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.220414 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 19:07:07.220498 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.220428 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 19:07:07.240156 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.240140 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-214.ec2.internal\" not found" node="ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.243352 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.243336 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-214.ec2.internal\" not found" node="ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.270774 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.270756 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.279065 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.279042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f268c01941fb4d77b80d374b8add70b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal\" (UID: \"9f268c01941fb4d77b80d374b8add70b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.279125 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.279069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f268c01941fb4d77b80d374b8add70b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal\" (UID: \"9f268c01941fb4d77b80d374b8add70b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.279125 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.279085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3f5616a56b4d82e03f666685a7c47e3f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-214.ec2.internal\" (UID: \"3f5616a56b4d82e03f666685a7c47e3f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.371070 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.370994 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.379408 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.379388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f268c01941fb4d77b80d374b8add70b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal\" (UID: \"9f268c01941fb4d77b80d374b8add70b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.379475 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.379415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f268c01941fb4d77b80d374b8add70b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal\" (UID: \"9f268c01941fb4d77b80d374b8add70b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.379475 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.379433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3f5616a56b4d82e03f666685a7c47e3f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-214.ec2.internal\" (UID: \"3f5616a56b4d82e03f666685a7c47e3f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.379569 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.379474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3f5616a56b4d82e03f666685a7c47e3f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-214.ec2.internal\" (UID: \"3f5616a56b4d82e03f666685a7c47e3f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.379569 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.379509 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f268c01941fb4d77b80d374b8add70b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal\" (UID: \"9f268c01941fb4d77b80d374b8add70b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.379569 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.379511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f268c01941fb4d77b80d374b8add70b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal\" (UID: \"9f268c01941fb4d77b80d374b8add70b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.471821 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.471781 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.542316 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.542284 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.546039 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.546020 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal"
Apr 24 19:07:07.572655 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.572548 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.673114 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.673084 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.773677 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.773642 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.871954 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.871931 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 19:07:07.873858 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.873842 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.883280 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.883258 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 19:07:07.883397 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.883374 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 19:07:07.883447 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.883410 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 19:07:07.950323 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.950242 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 19:07:07.974591 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:07.974557 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found"
Apr 24 19:07:07.975666 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.975654 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 19:07:07.984557 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:07.984538 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 19:07:08.008587 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.008556 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9sdxf"
Apr 24 19:07:08.011688 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.011664 2573 certificate_manager.go:715] "Certificate rotation deadline determined"
logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 19:02:06 +0000 UTC" deadline="2028-01-12 13:08:06.876578392 +0000 UTC" Apr 24 19:07:08.011740 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.011689 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15066h0m58.864892305s" Apr 24 19:07:08.019510 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.019486 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9sdxf" Apr 24 19:07:08.074927 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:08.074897 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found" Apr 24 19:07:08.158083 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:08.158043 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f5616a56b4d82e03f666685a7c47e3f.slice/crio-23b9b139e56eb92eff330f4569ae02c616837a5c62585a22f837a622ecf60a06 WatchSource:0}: Error finding container 23b9b139e56eb92eff330f4569ae02c616837a5c62585a22f837a622ecf60a06: Status 404 returned error can't find the container with id 23b9b139e56eb92eff330f4569ae02c616837a5c62585a22f837a622ecf60a06 Apr 24 19:07:08.158358 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:08.158342 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f268c01941fb4d77b80d374b8add70b.slice/crio-7f1d7eaedf5f30ee6bf0f4ef073526736b3ed6fd9a98a8a3752b1810e9713022 WatchSource:0}: Error finding container 7f1d7eaedf5f30ee6bf0f4ef073526736b3ed6fd9a98a8a3752b1810e9713022: Status 404 returned error can't find the container with id 7f1d7eaedf5f30ee6bf0f4ef073526736b3ed6fd9a98a8a3752b1810e9713022 Apr 24 19:07:08.164130 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.164060 2573 
provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:07:08.175300 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:08.175277 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found" Apr 24 19:07:08.275859 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:08.275787 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found" Apr 24 19:07:08.376255 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:08.376211 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-214.ec2.internal\" not found" Apr 24 19:07:08.441052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.441024 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:07:08.476200 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.476157 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal" Apr 24 19:07:08.488466 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.488434 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:07:08.489435 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.489420 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal" Apr 24 19:07:08.498989 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.498958 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:07:08.952208 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.952173 2573 apiserver.go:52] 
"Watching apiserver" Apr 24 19:07:08.960262 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.960233 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 19:07:08.961586 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.961555 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bk6r2","openshift-image-registry/node-ca-2p82x","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal","openshift-multus/multus-4zr6t","openshift-multus/multus-additional-cni-plugins-zhc28","openshift-multus/network-metrics-daemon-cr4ls","kube-system/konnectivity-agent-lgp86","openshift-cluster-node-tuning-operator/tuned-pr4b4","openshift-network-diagnostics/network-check-target-r98mn","openshift-network-operator/iptables-alerter-w2hgp","openshift-ovn-kubernetes/ovnkube-node-k6tsb","kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"] Apr 24 19:07:08.964325 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.964206 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:08.964534 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.964515 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2p82x" Apr 24 19:07:08.967366 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.967338 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:08.968239 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.966659 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.968708 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.968364 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 19:07:08.968708 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.968558 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 19:07:08.968708 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.968621 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-52zwb\"" Apr 24 19:07:08.968905 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.968781 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 19:07:08.968905 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.968842 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 19:07:08.968905 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.968784 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mzd8w\"" Apr 24 19:07:08.969122 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.969102 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 19:07:08.969827 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.969807 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 19:07:08.969915 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.969836 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 
19:07:08.970104 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.970087 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 19:07:08.970591 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.970570 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cpffq\"" Apr 24 19:07:08.970704 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.970630 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:08.970765 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:08.970705 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:08.971733 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.971453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 19:07:08.971733 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.971670 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 19:07:08.973006 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.972989 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-bk6r2" Apr 24 19:07:08.973534 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.973507 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 19:07:08.973944 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.973924 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xcndp\"" Apr 24 19:07:08.975840 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.975813 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 19:07:08.975981 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.975964 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 19:07:08.976070 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.976051 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jf8c5\"" Apr 24 19:07:08.976814 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.976553 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.977787 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.977767 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:08.977879 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:08.977832 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:08.977982 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.977963 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w2hgp" Apr 24 19:07:08.978634 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.978594 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:07:08.978745 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.978728 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xkz9p\"" Apr 24 19:07:08.979044 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.979024 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 19:07:08.979385 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.979363 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:08.980841 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.980822 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" Apr 24 19:07:08.983058 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.983039 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 19:07:08.983231 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.983217 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 19:07:08.983342 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.983328 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 19:07:08.983503 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.983489 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:07:08.984069 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.984050 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 19:07:08.984161 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.984118 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sd4tx\"" Apr 24 19:07:08.984212 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.984197 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 19:07:08.984267 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.984218 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 19:07:08.984267 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.984241 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 19:07:08.984267 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.984264 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pgvc8\"" Apr 24 19:07:08.984428 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.984411 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 19:07:08.984879 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.984861 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 19:07:08.985065 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.985049 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 19:07:08.985402 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.985385 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lgxvb\"" Apr 24 19:07:08.985597 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.985570 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 19:07:08.989001 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.988979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-run-k8s-cni-cncf-io\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989083 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989018 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-var-lib-kubelet\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989083 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-etc-kubernetes\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989083 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xgr8\" (UniqueName: \"kubernetes.io/projected/8a81da49-19b8-407f-a961-d85a0ec045e1-kube-api-access-4xgr8\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:08.989241 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989095 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-os-release\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:08.989241 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ffgz\" (UniqueName: \"kubernetes.io/projected/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-kube-api-access-6ffgz\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " 
pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:08.989241 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989145 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/79c1d341-2bed-41fe-b49c-3f1de4604feb-hosts-file\") pod \"node-resolver-bk6r2\" (UID: \"79c1d341-2bed-41fe-b49c-3f1de4604feb\") " pod="openshift-dns/node-resolver-bk6r2" Apr 24 19:07:08.989241 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-lib-modules\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.989241 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989192 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-run-netns\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989241 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-multus-conf-dir\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/910be432-3c6b-4796-b69f-ec249fce39e9-multus-daemon-config\") pod 
\"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989293 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-run-multus-certs\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989318 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-system-cni-dir\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:08.989516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-cni-binary-copy\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:08.989516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:08.989516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989391 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-var-lib-cni-multus\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989423 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-hostroot\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92-host\") pod \"node-ca-2p82x\" (UID: \"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92\") " pod="openshift-image-registry/node-ca-2p82x" Apr 24 19:07:08.989516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989475 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-sysconfig\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989525 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-os-release\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 
19:07:08.989574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks96q\" (UniqueName: \"kubernetes.io/projected/910be432-3c6b-4796-b69f-ec249fce39e9-kube-api-access-ks96q\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bb7f420a-f722-43ae-b31f-4de4b069fe5c-konnectivity-ca\") pod \"konnectivity-agent-lgp86\" (UID: \"bb7f420a-f722-43ae-b31f-4de4b069fe5c\") " pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989705 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svx9l\" (UniqueName: \"kubernetes.io/projected/9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92-kube-api-access-svx9l\") pod \"node-ca-2p82x\" (UID: \"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92\") " pod="openshift-image-registry/node-ca-2p82x" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-sysctl-conf\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989755 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-host\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73d34880-7e47-4c3a-869e-7a929e328a13-tmp\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-multus-cni-dir\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989837 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-multus-socket-dir-parent\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bb7f420a-f722-43ae-b31f-4de4b069fe5c-agent-certs\") pod 
\"konnectivity-agent-lgp86\" (UID: \"bb7f420a-f722-43ae-b31f-4de4b069fe5c\") " pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-var-lib-kubelet\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989912 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hrm7\" (UniqueName: \"kubernetes.io/projected/decb2c4f-4ec7-44bc-abba-863c45b63162-kube-api-access-4hrm7\") pod \"iptables-alerter-w2hgp\" (UID: \"decb2c4f-4ec7-44bc-abba-863c45b63162\") " pod="openshift-network-operator/iptables-alerter-w2hgp" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989934 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-cnibin\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.989955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-cnibin\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.989982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990017 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79c1d341-2bed-41fe-b49c-3f1de4604feb-tmp-dir\") pod \"node-resolver-bk6r2\" (UID: \"79c1d341-2bed-41fe-b49c-3f1de4604feb\") " pod="openshift-dns/node-resolver-bk6r2" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-sysctl-d\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-run\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.990661 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:07:08.990100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-sys\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-var-lib-cni-bin\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92-serviceca\") pod \"node-ca-2p82x\" (UID: \"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92\") " pod="openshift-image-registry/node-ca-2p82x" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvpzf\" (UniqueName: \"kubernetes.io/projected/79c1d341-2bed-41fe-b49c-3f1de4604feb-kube-api-access-cvpzf\") pod \"node-resolver-bk6r2\" (UID: \"79c1d341-2bed-41fe-b49c-3f1de4604feb\") " pod="openshift-dns/node-resolver-bk6r2" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-modprobe-d\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990272 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-kubernetes\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990329 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-systemd\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990360 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990381 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-system-cni-dir\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/910be432-3c6b-4796-b69f-ec249fce39e9-cni-binary-copy\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:08.990661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/73d34880-7e47-4c3a-869e-7a929e328a13-etc-tuned\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.991478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990473 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dd4\" (UniqueName: \"kubernetes.io/projected/73d34880-7e47-4c3a-869e-7a929e328a13-kube-api-access-w8dd4\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:08.991478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/decb2c4f-4ec7-44bc-abba-863c45b63162-iptables-alerter-script\") pod \"iptables-alerter-w2hgp\" (UID: \"decb2c4f-4ec7-44bc-abba-863c45b63162\") " pod="openshift-network-operator/iptables-alerter-w2hgp" Apr 24 19:07:08.991478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:08.990561 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/decb2c4f-4ec7-44bc-abba-863c45b63162-host-slash\") pod \"iptables-alerter-w2hgp\" (UID: \"decb2c4f-4ec7-44bc-abba-863c45b63162\") " pod="openshift-network-operator/iptables-alerter-w2hgp" Apr 24 19:07:09.021848 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.021192 2573 
certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:02:08 +0000 UTC" deadline="2028-01-21 10:24:48.042593116 +0000 UTC" Apr 24 19:07:09.021848 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.021220 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15279h17m39.021376637s" Apr 24 19:07:09.078462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.078431 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 19:07:09.091318 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvpzf\" (UniqueName: \"kubernetes.io/projected/79c1d341-2bed-41fe-b49c-3f1de4604feb-kube-api-access-cvpzf\") pod \"node-resolver-bk6r2\" (UID: \"79c1d341-2bed-41fe-b49c-3f1de4604feb\") " pod="openshift-dns/node-resolver-bk6r2" Apr 24 19:07:09.091478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-systemd\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.091478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091362 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-kubelet\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.091478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-cni-netd\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.091478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091414 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-ovn-node-metrics-cert\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.091478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dd4\" (UniqueName: \"kubernetes.io/projected/73d34880-7e47-4c3a-869e-7a929e328a13-kube-api-access-w8dd4\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.091749 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/decb2c4f-4ec7-44bc-abba-863c45b63162-iptables-alerter-script\") pod \"iptables-alerter-w2hgp\" (UID: \"decb2c4f-4ec7-44bc-abba-863c45b63162\") " pod="openshift-network-operator/iptables-alerter-w2hgp" Apr 24 19:07:09.091749 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/decb2c4f-4ec7-44bc-abba-863c45b63162-host-slash\") pod \"iptables-alerter-w2hgp\" (UID: \"decb2c4f-4ec7-44bc-abba-863c45b63162\") " pod="openshift-network-operator/iptables-alerter-w2hgp" Apr 24 19:07:09.091749 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:07:09.091577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/decb2c4f-4ec7-44bc-abba-863c45b63162-host-slash\") pod \"iptables-alerter-w2hgp\" (UID: \"decb2c4f-4ec7-44bc-abba-863c45b63162\") " pod="openshift-network-operator/iptables-alerter-w2hgp" Apr 24 19:07:09.091749 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-slash\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.091749 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091694 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-systemd\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xgr8\" (UniqueName: \"kubernetes.io/projected/8a81da49-19b8-407f-a961-d85a0ec045e1-kube-api-access-4xgr8\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " 
pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091803 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-lib-modules\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091832 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-run-netns\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091857 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/910be432-3c6b-4796-b69f-ec249fce39e9-multus-daemon-config\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-run-multus-certs\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-system-cni-dir\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " 
pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-etc-openvswitch\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091938 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-run-netns\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.091984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.091975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glnlf\" (UniqueName: \"kubernetes.io/projected/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-kube-api-access-glnlf\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092035 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-run-multus-certs\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092052 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-lib-modules\") pod \"tuned-pr4b4\" (UID: 
\"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092074 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-system-cni-dir\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092080 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-device-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-var-lib-cni-multus\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092167 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-var-lib-cni-multus\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/decb2c4f-4ec7-44bc-abba-863c45b63162-iptables-alerter-script\") pod \"iptables-alerter-w2hgp\" (UID: \"decb2c4f-4ec7-44bc-abba-863c45b63162\") " pod="openshift-network-operator/iptables-alerter-w2hgp" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92-host\") pod \"node-ca-2p82x\" (UID: \"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92\") " pod="openshift-image-registry/node-ca-2p82x" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-node-log\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92-host\") pod \"node-ca-2p82x\" (UID: \"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92\") " pod="openshift-image-registry/node-ca-2p82x" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092276 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-etc-selinux\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-os-release\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svx9l\" (UniqueName: \"kubernetes.io/projected/9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92-kube-api-access-svx9l\") pod \"node-ca-2p82x\" (UID: \"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92\") " pod="openshift-image-registry/node-ca-2p82x" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-sysctl-conf\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092361 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-run-netns\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-run-ovn-kubernetes\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.092414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092396 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-os-release\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092414 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-ovnkube-config\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-multus-socket-dir-parent\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092483 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bb7f420a-f722-43ae-b31f-4de4b069fe5c-agent-certs\") pod \"konnectivity-agent-lgp86\" (UID: \"bb7f420a-f722-43ae-b31f-4de4b069fe5c\") " pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092505 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-cnibin\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-cnibin\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092552 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79c1d341-2bed-41fe-b49c-3f1de4604feb-tmp-dir\") pod \"node-resolver-bk6r2\" (UID: \"79c1d341-2bed-41fe-b49c-3f1de4604feb\") " pod="openshift-dns/node-resolver-bk6r2" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092557 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-sysctl-conf\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/910be432-3c6b-4796-b69f-ec249fce39e9-multus-daemon-config\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-sysctl-d\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-run\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-log-socket\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092692 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-var-lib-cni-bin\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-sysctl-d\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092715 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92-serviceca\") pod \"node-ca-2p82x\" (UID: \"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92\") " pod="openshift-image-registry/node-ca-2p82x" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-modprobe-d\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092830 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-run\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092841 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-multus-socket-dir-parent\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093142 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-cnibin\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-kubernetes\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-run-ovn\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.092981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-ovnkube-script-lib\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-system-cni-dir\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093060 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/910be432-3c6b-4796-b69f-ec249fce39e9-cni-binary-copy\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/73d34880-7e47-4c3a-869e-7a929e328a13-etc-tuned\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093107 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-sys-fs\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093134 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-run-k8s-cni-cncf-io\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093159 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-var-lib-kubelet\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-etc-kubernetes\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093208 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-os-release\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92-serviceca\") pod \"node-ca-2p82x\" (UID: \"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92\") " pod="openshift-image-registry/node-ca-2p82x" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093237 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ffgz\" (UniqueName: \"kubernetes.io/projected/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-kube-api-access-6ffgz\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093257 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-kubernetes\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/79c1d341-2bed-41fe-b49c-3f1de4604feb-hosts-file\") pod \"node-resolver-bk6r2\" (UID: \"79c1d341-2bed-41fe-b49c-3f1de4604feb\") " pod="openshift-dns/node-resolver-bk6r2" Apr 24 19:07:09.093873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093321 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/79c1d341-2bed-41fe-b49c-3f1de4604feb-hosts-file\") pod \"node-resolver-bk6r2\" (UID: \"79c1d341-2bed-41fe-b49c-3f1de4604feb\") " pod="openshift-dns/node-resolver-bk6r2" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-host\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73d34880-7e47-4c3a-869e-7a929e328a13-tmp\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-multus-conf-dir\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-cni-binary-copy\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093458 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-run-openvswitch\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-etc-kubernetes\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-cni-bin\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-os-release\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093536 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-registration-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" Apr 24 19:07:09.094703 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:07:09.093563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-hostroot\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-sysconfig\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093630 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-systemd-units\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks96q\" (UniqueName: \"kubernetes.io/projected/910be432-3c6b-4796-b69f-ec249fce39e9-kube-api-access-ks96q\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.094703 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093682 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:09.094703 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:07:09.093708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bb7f420a-f722-43ae-b31f-4de4b069fe5c-konnectivity-ca\") pod \"konnectivity-agent-lgp86\" (UID: \"bb7f420a-f722-43ae-b31f-4de4b069fe5c\") " pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093734 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hrm7\" (UniqueName: \"kubernetes.io/projected/decb2c4f-4ec7-44bc-abba-863c45b63162-kube-api-access-4hrm7\") pod \"iptables-alerter-w2hgp\" (UID: \"decb2c4f-4ec7-44bc-abba-863c45b63162\") " pod="openshift-network-operator/iptables-alerter-w2hgp" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093763 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82lvc\" (UniqueName: \"kubernetes.io/projected/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-kube-api-access-82lvc\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-multus-cni-dir\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-var-lib-kubelet\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-var-lib-openvswitch\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-socket-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093938 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/910be432-3c6b-4796-b69f-ec249fce39e9-cni-binary-copy\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-sys\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-run-systemd\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094043 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-env-overrides\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79c1d341-2bed-41fe-b49c-3f1de4604feb-tmp-dir\") pod \"node-resolver-bk6r2\" (UID: \"79c1d341-2bed-41fe-b49c-3f1de4604feb\") " pod="openshift-dns/node-resolver-bk6r2" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094141 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-host\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-cnibin\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094211 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-var-lib-cni-bin\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.095439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094379 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-system-cni-dir\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-multus-conf-dir\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-run-k8s-cni-cncf-io\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.094801 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.094889 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs podName:8a81da49-19b8-407f-a961-d85a0ec045e1 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:09.594859313 +0000 UTC m=+3.065018821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs") pod "network-metrics-daemon-cr4ls" (UID: "8a81da49-19b8-407f-a961-d85a0ec045e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.094807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-var-lib-kubelet\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.093403 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-host-var-lib-kubelet\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095090 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-multus-cni-dir\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/910be432-3c6b-4796-b69f-ec249fce39e9-hostroot\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095171 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-sysconfig\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095246 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095322 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-sys\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095531 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-cni-binary-copy\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095587 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/73d34880-7e47-4c3a-869e-7a929e328a13-etc-modprobe-d\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095817 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28" Apr 24 19:07:09.096191 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.095969 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bb7f420a-f722-43ae-b31f-4de4b069fe5c-konnectivity-ca\") pod \"konnectivity-agent-lgp86\" (UID: \"bb7f420a-f722-43ae-b31f-4de4b069fe5c\") " pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:09.097868 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:07:09.097842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73d34880-7e47-4c3a-869e-7a929e328a13-tmp\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.097868 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.097857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/73d34880-7e47-4c3a-869e-7a929e328a13-etc-tuned\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" Apr 24 19:07:09.098884 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.098835 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bb7f420a-f722-43ae-b31f-4de4b069fe5c-agent-certs\") pod \"konnectivity-agent-lgp86\" (UID: \"bb7f420a-f722-43ae-b31f-4de4b069fe5c\") " pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:09.119790 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.119721 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal" event={"ID":"9f268c01941fb4d77b80d374b8add70b","Type":"ContainerStarted","Data":"7f1d7eaedf5f30ee6bf0f4ef073526736b3ed6fd9a98a8a3752b1810e9713022"} Apr 24 19:07:09.121325 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.121291 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal" event={"ID":"3f5616a56b4d82e03f666685a7c47e3f","Type":"ContainerStarted","Data":"23b9b139e56eb92eff330f4569ae02c616837a5c62585a22f837a622ecf60a06"} Apr 24 19:07:09.130370 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.130343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-svx9l\" (UniqueName: \"kubernetes.io/projected/9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92-kube-api-access-svx9l\") pod \"node-ca-2p82x\" (UID: \"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92\") " pod="openshift-image-registry/node-ca-2p82x"
Apr 24 19:07:09.134128 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.134101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xgr8\" (UniqueName: \"kubernetes.io/projected/8a81da49-19b8-407f-a961-d85a0ec045e1-kube-api-access-4xgr8\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:07:09.136267 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.136244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hrm7\" (UniqueName: \"kubernetes.io/projected/decb2c4f-4ec7-44bc-abba-863c45b63162-kube-api-access-4hrm7\") pod \"iptables-alerter-w2hgp\" (UID: \"decb2c4f-4ec7-44bc-abba-863c45b63162\") " pod="openshift-network-operator/iptables-alerter-w2hgp"
Apr 24 19:07:09.137276 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.137176 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:07:09.137276 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.137242 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:07:09.137276 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.137258 2573 projected.go:194] Error preparing data for projected volume kube-api-access-p88zd for pod openshift-network-diagnostics/network-check-target-r98mn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:09.137462 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.137346 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd podName:fd984433-e82d-4be9-964f-829123c5bb26 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:09.63732636 +0000 UTC m=+3.107485876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-p88zd" (UniqueName: "kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd") pod "network-check-target-r98mn" (UID: "fd984433-e82d-4be9-964f-829123c5bb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:09.139535 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.139515 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ffgz\" (UniqueName: \"kubernetes.io/projected/6f5ee354-6f03-4d9b-8ef6-57c8988d266c-kube-api-access-6ffgz\") pod \"multus-additional-cni-plugins-zhc28\" (UID: \"6f5ee354-6f03-4d9b-8ef6-57c8988d266c\") " pod="openshift-multus/multus-additional-cni-plugins-zhc28"
Apr 24 19:07:09.140511 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.140469 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvpzf\" (UniqueName: \"kubernetes.io/projected/79c1d341-2bed-41fe-b49c-3f1de4604feb-kube-api-access-cvpzf\") pod \"node-resolver-bk6r2\" (UID: \"79c1d341-2bed-41fe-b49c-3f1de4604feb\") " pod="openshift-dns/node-resolver-bk6r2"
Apr 24 19:07:09.150308 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.150279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dd4\" (UniqueName: \"kubernetes.io/projected/73d34880-7e47-4c3a-869e-7a929e328a13-kube-api-access-w8dd4\") pod \"tuned-pr4b4\" (UID: \"73d34880-7e47-4c3a-869e-7a929e328a13\") " pod="openshift-cluster-node-tuning-operator/tuned-pr4b4"
Apr 24 19:07:09.156155 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.156130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks96q\" (UniqueName: \"kubernetes.io/projected/910be432-3c6b-4796-b69f-ec249fce39e9-kube-api-access-ks96q\") pod \"multus-4zr6t\" (UID: \"910be432-3c6b-4796-b69f-ec249fce39e9\") " pod="openshift-multus/multus-4zr6t"
Apr 24 19:07:09.194820 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-kubelet\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.194820 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-cni-netd\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-ovn-node-metrics-cert\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-kubelet\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-slash\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194905 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-cni-netd\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194941 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-slash\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-etc-openvswitch\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glnlf\" (UniqueName: \"kubernetes.io/projected/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-kube-api-access-glnlf\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.194996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-device-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.195052 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-etc-openvswitch\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195064 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-node-log\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-etc-selinux\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195137 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-node-log\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-device-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-run-netns\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195179 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-run-netns\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195191 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-etc-selinux\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-run-ovn-kubernetes\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195231 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-ovnkube-config\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195238 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-run-ovn-kubernetes\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-log-socket\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-run-ovn\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-ovnkube-script-lib\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-log-socket\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195355 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195379 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-run-ovn\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.195462 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195399 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-sys-fs\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-sys-fs\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-run-openvswitch\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195470 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-cni-bin\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-registration-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-systemd-units\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-cni-bin\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195593 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82lvc\" (UniqueName: \"kubernetes.io/projected/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-kube-api-access-82lvc\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-var-lib-openvswitch\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195649 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-registration-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195682 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-socket-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-systemd-units\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-run-systemd\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-env-overrides\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-run-openvswitch\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196809 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195867 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-socket-dir\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.196809 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195880 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-ovnkube-script-lib\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196809 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-var-lib-openvswitch\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196809 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.195948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-run-systemd\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196809 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.196124 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-env-overrides\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.196809 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.196407 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-ovnkube-config\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.197792 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.197773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-ovn-node-metrics-cert\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.206492 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.206425 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82lvc\" (UniqueName: \"kubernetes.io/projected/71cdc1ff-6a00-416c-bb1d-5c2ae283909d-kube-api-access-82lvc\") pod \"aws-ebs-csi-driver-node-l5fcf\" (UID: \"71cdc1ff-6a00-416c-bb1d-5c2ae283909d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.206492 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.206467 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glnlf\" (UniqueName: \"kubernetes.io/projected/d25f07b3-9bc3-4e49-ad47-406fe9d7e1da-kube-api-access-glnlf\") pod \"ovnkube-node-k6tsb\" (UID: \"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da\") " pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.223498 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.223464 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 19:07:09.278623 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.278578 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lgp86"
Apr 24 19:07:09.286390 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.286363 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2p82x"
Apr 24 19:07:09.296902 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.296875 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zhc28"
Apr 24 19:07:09.302459 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.302436 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4zr6t"
Apr 24 19:07:09.309993 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.309974 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bk6r2"
Apr 24 19:07:09.318529 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.318502 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pr4b4"
Apr 24 19:07:09.326010 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.325983 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w2hgp"
Apr 24 19:07:09.334653 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.334636 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:09.346237 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.346217 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf"
Apr 24 19:07:09.599146 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.599056 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:07:09.599324 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.599225 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:09.599324 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.599298 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs podName:8a81da49-19b8-407f-a961-d85a0ec045e1 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:10.599279019 +0000 UTC m=+4.069438524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs") pod "network-metrics-daemon-cr4ls" (UID: "8a81da49-19b8-407f-a961-d85a0ec045e1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:09.700006 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:09.699975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn"
Apr 24 19:07:09.700184 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.700112 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:07:09.700184 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.700133 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:07:09.700184 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.700143 2573 projected.go:194] Error preparing data for projected volume kube-api-access-p88zd for pod openshift-network-diagnostics/network-check-target-r98mn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:09.700350 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:09.700199 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd podName:fd984433-e82d-4be9-964f-829123c5bb26 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:10.700185416 +0000 UTC m=+4.170344925 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-p88zd" (UniqueName: "kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd") pod "network-check-target-r98mn" (UID: "fd984433-e82d-4be9-964f-829123c5bb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:09.886453 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:09.886425 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d34880_7e47_4c3a_869e_7a929e328a13.slice/crio-764144203c2f9b05bddd4821f6ffe9d29550ed71722570bd8faa66bc0a4d13c3 WatchSource:0}: Error finding container 764144203c2f9b05bddd4821f6ffe9d29550ed71722570bd8faa66bc0a4d13c3: Status 404 returned error can't find the container with id 764144203c2f9b05bddd4821f6ffe9d29550ed71722570bd8faa66bc0a4d13c3
Apr 24 19:07:09.888543 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:09.888502 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb7f420a_f722_43ae_b31f_4de4b069fe5c.slice/crio-8a000395978af17cd3c906ee4e45f0250929fa207555476e92475aa9d6f1ad1d WatchSource:0}: Error finding container 8a000395978af17cd3c906ee4e45f0250929fa207555476e92475aa9d6f1ad1d: Status 404 returned error can't find the container with id 8a000395978af17cd3c906ee4e45f0250929fa207555476e92475aa9d6f1ad1d
Apr 24 19:07:09.894965 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:09.894945 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f5ee354_6f03_4d9b_8ef6_57c8988d266c.slice/crio-976625eaeb20ab9378b11900a8254f3a4f9102ffa5592e99590580b92021fb2f WatchSource:0}: Error finding container 976625eaeb20ab9378b11900a8254f3a4f9102ffa5592e99590580b92021fb2f: Status 404 returned error can't find the container with id 976625eaeb20ab9378b11900a8254f3a4f9102ffa5592e99590580b92021fb2f
Apr 24 19:07:09.895828 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:09.895787 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd25f07b3_9bc3_4e49_ad47_406fe9d7e1da.slice/crio-ba069684829065b1f03fde630332d472ed99607d6190c748d09239790720b8e3 WatchSource:0}: Error finding container ba069684829065b1f03fde630332d472ed99607d6190c748d09239790720b8e3: Status 404 returned error can't find the container with id ba069684829065b1f03fde630332d472ed99607d6190c748d09239790720b8e3
Apr 24 19:07:09.896676 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:09.896656 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79c1d341_2bed_41fe_b49c_3f1de4604feb.slice/crio-dae0f787e7f58694d37e6bf54fb3d89765d22d518ed62157c96ce45422d55337 WatchSource:0}: Error finding container dae0f787e7f58694d37e6bf54fb3d89765d22d518ed62157c96ce45422d55337: Status 404 returned error can't find the container with id dae0f787e7f58694d37e6bf54fb3d89765d22d518ed62157c96ce45422d55337
Apr 24 19:07:09.897642 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:09.897622 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod910be432_3c6b_4796_b69f_ec249fce39e9.slice/crio-34c08e112f0b5f19d2482470bacfa9a9a178cfb98e7a0b5fd35bbbc010e6a549 WatchSource:0}: Error finding container 34c08e112f0b5f19d2482470bacfa9a9a178cfb98e7a0b5fd35bbbc010e6a549: Status 404 returned error can't find the container with id 34c08e112f0b5f19d2482470bacfa9a9a178cfb98e7a0b5fd35bbbc010e6a549
Apr 24 19:07:09.899047 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:09.898964 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e31045f_47cd_4c8f_bb8b_a6a36b6bdb92.slice/crio-5e19651ca7ade59489e54a4a6695081420cfcdf0724d5e9ff7b7bb2c07d6550e WatchSource:0}: Error finding container 5e19651ca7ade59489e54a4a6695081420cfcdf0724d5e9ff7b7bb2c07d6550e: Status 404 returned error can't find the container with id 5e19651ca7ade59489e54a4a6695081420cfcdf0724d5e9ff7b7bb2c07d6550e
Apr 24 19:07:09.900211 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:09.900192 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71cdc1ff_6a00_416c_bb1d_5c2ae283909d.slice/crio-b7be8bc3015a3ca1853cd17eb24dbbd0515da013d28491e2a7698917f7a5463c WatchSource:0}: Error finding container b7be8bc3015a3ca1853cd17eb24dbbd0515da013d28491e2a7698917f7a5463c: Status 404 returned error can't find the container with id b7be8bc3015a3ca1853cd17eb24dbbd0515da013d28491e2a7698917f7a5463c
Apr 24 19:07:09.901242 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:09.901217 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddecb2c4f_4ec7_44bc_abba_863c45b63162.slice/crio-61106b1cab6e14a61b8834b08004c81c5c4ceb06a38de7bf48eebf35e5c213ec WatchSource:0}: Error finding container 61106b1cab6e14a61b8834b08004c81c5c4ceb06a38de7bf48eebf35e5c213ec: Status 404 returned error can't find the container with id 61106b1cab6e14a61b8834b08004c81c5c4ceb06a38de7bf48eebf35e5c213ec
Apr 24 19:07:10.021576 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.021539 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:02:08 +0000 UTC" deadline="2028-02-07 11:16:45.690414884 +0000 UTC"
Apr 24 19:07:10.021576 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.021566 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15688h9m35.668851328s"
Apr 24 19:07:10.124550 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.124511 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" event={"ID":"71cdc1ff-6a00-416c-bb1d-5c2ae283909d","Type":"ContainerStarted","Data":"b7be8bc3015a3ca1853cd17eb24dbbd0515da013d28491e2a7698917f7a5463c"}
Apr 24 19:07:10.125717 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.125690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w2hgp" event={"ID":"decb2c4f-4ec7-44bc-abba-863c45b63162","Type":"ContainerStarted","Data":"61106b1cab6e14a61b8834b08004c81c5c4ceb06a38de7bf48eebf35e5c213ec"}
Apr 24 19:07:10.126645 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.126544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2p82x" event={"ID":"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92","Type":"ContainerStarted","Data":"5e19651ca7ade59489e54a4a6695081420cfcdf0724d5e9ff7b7bb2c07d6550e"}
Apr 24 19:07:10.128230 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.128210 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bk6r2" event={"ID":"79c1d341-2bed-41fe-b49c-3f1de4604feb","Type":"ContainerStarted","Data":"dae0f787e7f58694d37e6bf54fb3d89765d22d518ed62157c96ce45422d55337"}
Apr 24 19:07:10.129146 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.129128 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhc28" event={"ID":"6f5ee354-6f03-4d9b-8ef6-57c8988d266c","Type":"ContainerStarted","Data":"976625eaeb20ab9378b11900a8254f3a4f9102ffa5592e99590580b92021fb2f"}
Apr 24 19:07:10.130082 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.130059 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerStarted","Data":"ba069684829065b1f03fde630332d472ed99607d6190c748d09239790720b8e3"}
Apr 24 19:07:10.130988 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.130962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" event={"ID":"73d34880-7e47-4c3a-869e-7a929e328a13","Type":"ContainerStarted","Data":"764144203c2f9b05bddd4821f6ffe9d29550ed71722570bd8faa66bc0a4d13c3"}
Apr 24 19:07:10.132374 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.132354 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal" event={"ID":"3f5616a56b4d82e03f666685a7c47e3f","Type":"ContainerStarted","Data":"4c1ce3744a033dcbe77621b66af88b27cb6df7279f96a7ccf475d9967d6a248f"}
Apr 24 19:07:10.133863 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.133840 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zr6t" event={"ID":"910be432-3c6b-4796-b69f-ec249fce39e9","Type":"ContainerStarted","Data":"34c08e112f0b5f19d2482470bacfa9a9a178cfb98e7a0b5fd35bbbc010e6a549"}
Apr 24 19:07:10.135254 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.135226 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lgp86" event={"ID":"bb7f420a-f722-43ae-b31f-4de4b069fe5c","Type":"ContainerStarted","Data":"8a000395978af17cd3c906ee4e45f0250929fa207555476e92475aa9d6f1ad1d"}
Apr 24 19:07:10.146938 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.146903 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-214.ec2.internal" podStartSLOduration=2.1468913880000002 podStartE2EDuration="2.146891388s" podCreationTimestamp="2026-04-24 19:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24
19:07:10.146240996 +0000 UTC m=+3.616400521" watchObservedRunningTime="2026-04-24 19:07:10.146891388 +0000 UTC m=+3.617050925" Apr 24 19:07:10.334066 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.334036 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:07:10.606562 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.606523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:10.606758 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:10.606739 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:10.606841 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:10.606816 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs podName:8a81da49-19b8-407f-a961-d85a0ec045e1 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:12.606797494 +0000 UTC m=+6.076957004 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs") pod "network-metrics-daemon-cr4ls" (UID: "8a81da49-19b8-407f-a961-d85a0ec045e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:10.707441 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:10.707342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:10.707623 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:10.707508 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:07:10.707623 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:10.707527 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:07:10.707623 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:10.707539 2573 projected.go:194] Error preparing data for projected volume kube-api-access-p88zd for pod openshift-network-diagnostics/network-check-target-r98mn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:10.707623 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:10.707598 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd podName:fd984433-e82d-4be9-964f-829123c5bb26 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:07:12.707581145 +0000 UTC m=+6.177740658 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-p88zd" (UniqueName: "kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd") pod "network-check-target-r98mn" (UID: "fd984433-e82d-4be9-964f-829123c5bb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:11.115968 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:11.115934 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:11.116396 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:11.115989 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:11.116396 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:11.116057 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:11.116396 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:11.116187 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:11.147676 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:11.147522 2573 generic.go:358] "Generic (PLEG): container finished" podID="9f268c01941fb4d77b80d374b8add70b" containerID="d50318175a27806bbb37d7c0de165ae801af3703fa71c7611044135d55de6802" exitCode=0 Apr 24 19:07:11.147823 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:11.147678 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal" event={"ID":"9f268c01941fb4d77b80d374b8add70b","Type":"ContainerDied","Data":"d50318175a27806bbb37d7c0de165ae801af3703fa71c7611044135d55de6802"} Apr 24 19:07:12.169448 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:12.169347 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal" event={"ID":"9f268c01941fb4d77b80d374b8add70b","Type":"ContainerStarted","Data":"e4d29d3ea62bc731ecc4c38a847aa2e5d18c46616706145bad1d7e81710b28c2"} Apr 24 19:07:12.623693 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:12.623067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:12.623693 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:12.623217 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:12.623693 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:12.623277 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs 
podName:8a81da49-19b8-407f-a961-d85a0ec045e1 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:16.623259289 +0000 UTC m=+10.093418798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs") pod "network-metrics-daemon-cr4ls" (UID: "8a81da49-19b8-407f-a961-d85a0ec045e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:12.724715 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:12.724042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:12.724715 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:12.724255 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:07:12.724715 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:12.724278 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:07:12.724715 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:12.724291 2573 projected.go:194] Error preparing data for projected volume kube-api-access-p88zd for pod openshift-network-diagnostics/network-check-target-r98mn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:12.724715 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:12.724350 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd podName:fd984433-e82d-4be9-964f-829123c5bb26 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:16.724331922 +0000 UTC m=+10.194491437 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-p88zd" (UniqueName: "kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd") pod "network-check-target-r98mn" (UID: "fd984433-e82d-4be9-964f-829123c5bb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:13.116054 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:13.115952 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:13.116204 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:13.116134 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:13.116586 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:13.116565 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:13.116747 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:13.116701 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:15.116167 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:15.115676 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:15.116167 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:15.115863 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:15.116167 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:15.115897 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:15.116167 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:15.116021 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:16.173258 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.173199 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-214.ec2.internal" podStartSLOduration=8.173179275 podStartE2EDuration="8.173179275s" podCreationTimestamp="2026-04-24 19:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:12.184836618 +0000 UTC m=+5.654996149" watchObservedRunningTime="2026-04-24 19:07:16.173179275 +0000 UTC m=+9.643338806" Apr 24 19:07:16.173743 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.173441 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xf9r2"] Apr 24 19:07:16.176291 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.175991 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.176291 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.176070 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69" Apr 24 19:07:16.251931 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.251893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.252100 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.251968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6f8ef441-4de5-492e-9e4a-1c61639cde69-kubelet-config\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.252100 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.251995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6f8ef441-4de5-492e-9e4a-1c61639cde69-dbus\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.353179 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.353142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6f8ef441-4de5-492e-9e4a-1c61639cde69-kubelet-config\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.353345 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.353190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/6f8ef441-4de5-492e-9e4a-1c61639cde69-dbus\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.353345 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.353268 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.353345 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.353288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6f8ef441-4de5-492e-9e4a-1c61639cde69-kubelet-config\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.353520 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.353387 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:16.353520 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.353389 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6f8ef441-4de5-492e-9e4a-1c61639cde69-dbus\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.353520 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.353446 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret podName:6f8ef441-4de5-492e-9e4a-1c61639cde69 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:07:16.853428328 +0000 UTC m=+10.323587839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret") pod "global-pull-secret-syncer-xf9r2" (UID: "6f8ef441-4de5-492e-9e4a-1c61639cde69") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:16.657118 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.656405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:16.657118 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.656679 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:16.657118 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.656742 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs podName:8a81da49-19b8-407f-a961-d85a0ec045e1 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:24.656724226 +0000 UTC m=+18.126883734 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs") pod "network-metrics-daemon-cr4ls" (UID: "8a81da49-19b8-407f-a961-d85a0ec045e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:16.756963 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.756900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:16.757177 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.757112 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:07:16.757177 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.757138 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:07:16.757177 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.757154 2573 projected.go:194] Error preparing data for projected volume kube-api-access-p88zd for pod openshift-network-diagnostics/network-check-target-r98mn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:16.757328 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.757210 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd podName:fd984433-e82d-4be9-964f-829123c5bb26 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:07:24.757194228 +0000 UTC m=+18.227353739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-p88zd" (UniqueName: "kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd") pod "network-check-target-r98mn" (UID: "fd984433-e82d-4be9-964f-829123c5bb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:16.858103 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:16.858068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:16.858314 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.858248 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:16.858380 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:16.858321 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret podName:6f8ef441-4de5-492e-9e4a-1c61639cde69 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:17.858300174 +0000 UTC m=+11.328459682 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret") pod "global-pull-secret-syncer-xf9r2" (UID: "6f8ef441-4de5-492e-9e4a-1c61639cde69") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:17.116704 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:17.116676 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:17.116834 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:17.116787 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:17.116921 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:17.116902 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:17.117024 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:17.117004 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:17.867597 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:17.867555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:17.868149 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:17.867705 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:17.868149 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:17.867765 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret podName:6f8ef441-4de5-492e-9e4a-1c61639cde69 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:19.867746086 +0000 UTC m=+13.337905598 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret") pod "global-pull-secret-syncer-xf9r2" (UID: "6f8ef441-4de5-492e-9e4a-1c61639cde69") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:18.115869 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:18.115838 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:18.116031 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:18.115967 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69" Apr 24 19:07:19.115532 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:19.115338 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:19.115532 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:19.115473 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:19.116054 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:19.115531 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:19.116054 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:19.115665 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:19.883186 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:19.883152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:19.883350 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:19.883321 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:19.883424 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:19.883395 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret podName:6f8ef441-4de5-492e-9e4a-1c61639cde69 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:23.883374788 +0000 UTC m=+17.353534297 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret") pod "global-pull-secret-syncer-xf9r2" (UID: "6f8ef441-4de5-492e-9e4a-1c61639cde69") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:20.115843 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:20.115812 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:20.116314 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:20.115929 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69" Apr 24 19:07:21.115394 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:21.115356 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:21.115558 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:21.115397 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:21.115558 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:21.115496 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:21.115687 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:21.115628 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:22.115888 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:22.115839 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:22.116303 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:22.115958 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69" Apr 24 19:07:23.115520 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:23.115483 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:23.115707 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:23.115491 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:23.115707 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:23.115627 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:23.115825 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:23.115716 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:23.914830 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:23.914788 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:23.915227 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:23.914951 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:23.915227 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:23.915025 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret podName:6f8ef441-4de5-492e-9e4a-1c61639cde69 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:31.915006702 +0000 UTC m=+25.385166212 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret") pod "global-pull-secret-syncer-xf9r2" (UID: "6f8ef441-4de5-492e-9e4a-1c61639cde69") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:24.115752 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:24.115693 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:24.115916 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:24.115818 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69" Apr 24 19:07:24.720164 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:24.720121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:24.720333 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:24.720296 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:24.720405 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:24.720366 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs podName:8a81da49-19b8-407f-a961-d85a0ec045e1 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:40.720350037 +0000 UTC m=+34.190509547 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs") pod "network-metrics-daemon-cr4ls" (UID: "8a81da49-19b8-407f-a961-d85a0ec045e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:24.821259 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:24.821223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:24.821421 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:24.821385 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:07:24.821421 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:24.821408 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:07:24.821421 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:24.821421 2573 projected.go:194] Error preparing data for projected volume kube-api-access-p88zd for pod openshift-network-diagnostics/network-check-target-r98mn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:24.821530 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:24.821480 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd podName:fd984433-e82d-4be9-964f-829123c5bb26 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:07:40.821466531 +0000 UTC m=+34.291626038 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-p88zd" (UniqueName: "kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd") pod "network-check-target-r98mn" (UID: "fd984433-e82d-4be9-964f-829123c5bb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:25.115260 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:25.115135 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:25.115728 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:25.115270 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:25.115728 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:25.115328 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:25.115728 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:25.115442 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:26.115214 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:26.115178 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:26.115401 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:26.115301 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69" Apr 24 19:07:27.117433 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.117026 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:27.118046 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.117084 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:27.118046 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:27.117563 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:27.118046 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:27.117649 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:27.195665 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.195639 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" event={"ID":"71cdc1ff-6a00-416c-bb1d-5c2ae283909d","Type":"ContainerStarted","Data":"3d1679f68dfc673ddfbd669c16b17af642b2ca293abac99bc78df3a3e3c74a85"} Apr 24 19:07:27.196709 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.196689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2p82x" event={"ID":"9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92","Type":"ContainerStarted","Data":"ecc9da48a80285a6e65197430f284fb1ccc0a33e4cedc6c8fcb828c0a78bcef6"} Apr 24 19:07:27.197787 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.197759 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bk6r2" event={"ID":"79c1d341-2bed-41fe-b49c-3f1de4604feb","Type":"ContainerStarted","Data":"c56e4fc1dacd54948d7f9f6788452d527bbd514efaabe4d9bf946f3952fa26fa"} Apr 24 19:07:27.198988 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.198965 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f5ee354-6f03-4d9b-8ef6-57c8988d266c" containerID="ed14f1cf0352d7f9d2e1df4ca8885851505bc5e5850084f4247a6b97cbb2f661" exitCode=0 Apr 24 19:07:27.199070 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.199024 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhc28" event={"ID":"6f5ee354-6f03-4d9b-8ef6-57c8988d266c","Type":"ContainerDied","Data":"ed14f1cf0352d7f9d2e1df4ca8885851505bc5e5850084f4247a6b97cbb2f661"} Apr 24 19:07:27.202148 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.202128 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:07:27.202590 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.202568 2573 generic.go:358] "Generic (PLEG): container finished" podID="d25f07b3-9bc3-4e49-ad47-406fe9d7e1da" containerID="394f3820422bdf69a9ee046b61b815193e363f9e3d02c2c29548785de3b0b805" exitCode=1 Apr 24 19:07:27.202708 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.202639 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerStarted","Data":"7677531ffbce39fd9acbfc88b490e37ff2088c16b7db0791fd13936de0e7b333"} Apr 24 19:07:27.202708 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.202664 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerDied","Data":"394f3820422bdf69a9ee046b61b815193e363f9e3d02c2c29548785de3b0b805"} Apr 24 19:07:27.202708 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.202675 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerStarted","Data":"d16a528cc59f59fa36c13574525ae5efea58820a2ae804aa737cae8ce2cadd1f"} Apr 24 19:07:27.204029 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.203997 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" 
event={"ID":"73d34880-7e47-4c3a-869e-7a929e328a13","Type":"ContainerStarted","Data":"e9a583adfc69a04dc58d9e494498cd4311dd7eae10ad23d7c8868c29ef7ca673"} Apr 24 19:07:27.205124 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.205105 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zr6t" event={"ID":"910be432-3c6b-4796-b69f-ec249fce39e9","Type":"ContainerStarted","Data":"6b498be797005925b668ffe330ced4b7759a169e0229345b7e17feb43d5cdbbe"} Apr 24 19:07:27.206271 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.206253 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lgp86" event={"ID":"bb7f420a-f722-43ae-b31f-4de4b069fe5c","Type":"ContainerStarted","Data":"ebf9c0a24e28b02ba36fe03cf788d7c24c4b681ff1fba68b2aa8f36cf1fab23e"} Apr 24 19:07:27.237172 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.235678 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lgp86" podStartSLOduration=11.347698423 podStartE2EDuration="20.235657998s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.890489929 +0000 UTC m=+3.360649436" lastFinishedPulling="2026-04-24 19:07:18.778449493 +0000 UTC m=+12.248609011" observedRunningTime="2026-04-24 19:07:27.235205805 +0000 UTC m=+20.705365335" watchObservedRunningTime="2026-04-24 19:07:27.235657998 +0000 UTC m=+20.705817609" Apr 24 19:07:27.249620 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.249522 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2p82x" podStartSLOduration=3.406023084 podStartE2EDuration="20.249508867s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.901166124 +0000 UTC m=+3.371325636" lastFinishedPulling="2026-04-24 19:07:26.744651904 +0000 UTC m=+20.214811419" observedRunningTime="2026-04-24 19:07:27.249211407 +0000 UTC m=+20.719370947" 
watchObservedRunningTime="2026-04-24 19:07:27.249508867 +0000 UTC m=+20.719668395" Apr 24 19:07:27.266575 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.266533 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4zr6t" podStartSLOduration=3.398274572 podStartE2EDuration="20.266519086s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.899412362 +0000 UTC m=+3.369571867" lastFinishedPulling="2026-04-24 19:07:26.767656874 +0000 UTC m=+20.237816381" observedRunningTime="2026-04-24 19:07:27.265899847 +0000 UTC m=+20.736059377" watchObservedRunningTime="2026-04-24 19:07:27.266519086 +0000 UTC m=+20.736678613" Apr 24 19:07:27.282501 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.282456 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pr4b4" podStartSLOduration=3.424517831 podStartE2EDuration="20.282443668s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.888740501 +0000 UTC m=+3.358900008" lastFinishedPulling="2026-04-24 19:07:26.746666323 +0000 UTC m=+20.216825845" observedRunningTime="2026-04-24 19:07:27.282048002 +0000 UTC m=+20.752207531" watchObservedRunningTime="2026-04-24 19:07:27.282443668 +0000 UTC m=+20.752603195" Apr 24 19:07:27.297271 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:27.297236 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bk6r2" podStartSLOduration=3.497098231 podStartE2EDuration="20.297224628s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.898453949 +0000 UTC m=+3.368613455" lastFinishedPulling="2026-04-24 19:07:26.698580346 +0000 UTC m=+20.168739852" observedRunningTime="2026-04-24 19:07:27.297138904 +0000 UTC m=+20.767298432" watchObservedRunningTime="2026-04-24 19:07:27.297224628 +0000 UTC m=+20.767384156" Apr 24 
19:07:28.049097 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.049065 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:28.049649 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.049625 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:28.115447 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.115428 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:28.115555 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:28.115529 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69" Apr 24 19:07:28.209412 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.209372 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w2hgp" event={"ID":"decb2c4f-4ec7-44bc-abba-863c45b63162","Type":"ContainerStarted","Data":"9a4f1258e6c254bcf1bdacf8d06213d0c79686f0a7c8b68af0d4736bdc25268a"} Apr 24 19:07:28.212483 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.212461 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:07:28.212975 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.212952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" 
event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerStarted","Data":"8976c823a1dcc1b9b06a1f868b9a80a9e76571eaf4b4240ba1cc6ab53c24808d"} Apr 24 19:07:28.213092 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.212982 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerStarted","Data":"69283bfe64d66b74197171ec4b4009d854615b7227a5163057a06cec840c2a9c"} Apr 24 19:07:28.213092 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.212995 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerStarted","Data":"e9bf15f6f84f1d87122872dd1d625c5ece7525cf2babbc41646ddeac70e21a50"} Apr 24 19:07:28.213710 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.213691 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:28.213834 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.213737 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lgp86" Apr 24 19:07:28.224209 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.224165 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-w2hgp" podStartSLOduration=4.428948616 podStartE2EDuration="21.224153932s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.903344239 +0000 UTC m=+3.373503763" lastFinishedPulling="2026-04-24 19:07:26.698549569 +0000 UTC m=+20.168709079" observedRunningTime="2026-04-24 19:07:28.223879199 +0000 UTC m=+21.694038729" watchObservedRunningTime="2026-04-24 19:07:28.224153932 +0000 UTC m=+21.694313460" Apr 24 19:07:28.521272 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:28.521071 2573 plugin_watcher.go:194] "Adding 
socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 19:07:29.051540 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:29.051407 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T19:07:28.521267134Z","UUID":"0d82a096-5b78-4818-bed6-f6f07cb522b8","Handler":null,"Name":"","Endpoint":""} Apr 24 19:07:29.053364 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:29.053339 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 19:07:29.053478 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:29.053372 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 19:07:29.115348 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:29.115315 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:29.115514 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:29.115321 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:29.115514 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:29.115441 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:29.115657 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:29.115534 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:07:29.216161 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:29.216126 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" event={"ID":"71cdc1ff-6a00-416c-bb1d-5c2ae283909d","Type":"ContainerStarted","Data":"3710e118e3194d574d0b8d7476229c19c205c53a706ac4d01d90a4e186099c6d"} Apr 24 19:07:30.115566 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:30.115473 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:30.115730 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:30.115624 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69" Apr 24 19:07:30.221120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:30.221093 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:07:30.221788 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:30.221456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerStarted","Data":"8a871ceed5f91e2d3e845a62fa378979a4b5014bb9765b0f4f02060806ad99e2"} Apr 24 19:07:31.115796 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:31.115760 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:07:31.115959 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:31.115883 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26" Apr 24 19:07:31.115959 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:31.115949 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:07:31.116087 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:31.116075 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1"
Apr 24 19:07:31.974237 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:31.974013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2"
Apr 24 19:07:31.974237 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:31.974149 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 19:07:31.974771 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:31.974245 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret podName:6f8ef441-4de5-492e-9e4a-1c61639cde69 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:47.974230205 +0000 UTC m=+41.444389711 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret") pod "global-pull-secret-syncer-xf9r2" (UID: "6f8ef441-4de5-492e-9e4a-1c61639cde69") : object "kube-system"/"original-pull-secret" not registered
Apr 24 19:07:32.115204 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.115170 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2"
Apr 24 19:07:32.115362 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:32.115274 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69"
Apr 24 19:07:32.227532 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.227450 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" event={"ID":"71cdc1ff-6a00-416c-bb1d-5c2ae283909d","Type":"ContainerStarted","Data":"30fcd8ae2b7a96a14ad1496fc74e2711d639b188717a4271e8233a98a79a7225"}
Apr 24 19:07:32.229009 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.228981 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f5ee354-6f03-4d9b-8ef6-57c8988d266c" containerID="7ca88a1b54569c9918337556210c23889ab3c559e0db8cb502225bfeb9a7da2d" exitCode=0
Apr 24 19:07:32.229139 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.229071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhc28" event={"ID":"6f5ee354-6f03-4d9b-8ef6-57c8988d266c","Type":"ContainerDied","Data":"7ca88a1b54569c9918337556210c23889ab3c559e0db8cb502225bfeb9a7da2d"}
Apr 24 19:07:32.232069 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.232054 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:07:32.232410 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.232388 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerStarted","Data":"24cf8b3b085fe7d83c14c05fb286b540f22389790a2aafc40e38b1062648b186"}
Apr 24 19:07:32.232668 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.232652 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:32.232732 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.232678 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:32.232732 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.232690 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:32.232791 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.232776 2573 scope.go:117] "RemoveContainer" containerID="394f3820422bdf69a9ee046b61b815193e363f9e3d02c2c29548785de3b0b805"
Apr 24 19:07:32.249648 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.249628 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:32.249723 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.249704 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb"
Apr 24 19:07:32.271742 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:32.271703 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l5fcf" podStartSLOduration=3.7634071650000003 podStartE2EDuration="25.271690995s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.901833775 +0000 UTC m=+3.371993288" lastFinishedPulling="2026-04-24 19:07:31.410117608 +0000 UTC m=+24.880277118" observedRunningTime="2026-04-24 19:07:32.246189496 +0000 UTC m=+25.716349025" watchObservedRunningTime="2026-04-24 19:07:32.271690995 +0000 UTC m=+25.741850523"
Apr 24 19:07:33.115255 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.115221 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn"
Apr 24 19:07:33.115653 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.115228 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:07:33.115653 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:33.115320 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26"
Apr 24 19:07:33.115653 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:33.115409 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1"
Apr 24 19:07:33.237098 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.236895 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:07:33.237402 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.237378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" event={"ID":"d25f07b3-9bc3-4e49-ad47-406fe9d7e1da","Type":"ContainerStarted","Data":"2f6241aacd1181fbdeabc8f22bf376f59692602c5feea0c95946c3db333c811b"}
Apr 24 19:07:33.267516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.267474 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" podStartSLOduration=9.356767114 podStartE2EDuration="26.267456486s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.89856584 +0000 UTC m=+3.368725347" lastFinishedPulling="2026-04-24 19:07:26.809255213 +0000 UTC m=+20.279414719" observedRunningTime="2026-04-24 19:07:33.267055471 +0000 UTC m=+26.737215010" watchObservedRunningTime="2026-04-24 19:07:33.267456486 +0000 UTC m=+26.737616014"
Apr 24 19:07:33.660215 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.660180 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xf9r2"]
Apr 24 19:07:33.660378 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.660332 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2"
Apr 24 19:07:33.660528 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:33.660454 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69"
Apr 24 19:07:33.660987 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.660963 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cr4ls"]
Apr 24 19:07:33.661119 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.661053 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:07:33.661181 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:33.661158 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1"
Apr 24 19:07:33.661817 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.661793 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r98mn"]
Apr 24 19:07:33.661902 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:33.661874 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn"
Apr 24 19:07:33.661954 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:33.661938 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26"
Apr 24 19:07:34.240406 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:34.240371 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f5ee354-6f03-4d9b-8ef6-57c8988d266c" containerID="bac3837131d928fc10bdadb028fe8649f5d45cfe5709ab0c710352201d90ce40" exitCode=0
Apr 24 19:07:34.240958 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:34.240459 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhc28" event={"ID":"6f5ee354-6f03-4d9b-8ef6-57c8988d266c","Type":"ContainerDied","Data":"bac3837131d928fc10bdadb028fe8649f5d45cfe5709ab0c710352201d90ce40"}
Apr 24 19:07:35.115349 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:35.115274 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:07:35.115489 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:35.115262 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn"
Apr 24 19:07:35.115489 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:35.115373 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1"
Apr 24 19:07:35.115489 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:35.115388 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2"
Apr 24 19:07:35.115662 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:35.115493 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26"
Apr 24 19:07:35.115662 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:35.115603 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69"
Apr 24 19:07:35.244088 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:35.244059 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f5ee354-6f03-4d9b-8ef6-57c8988d266c" containerID="f28915776c8f5aba18c9f9aaa0006a24abc3e25f2f6e054346508bc8c0c7db61" exitCode=0
Apr 24 19:07:35.244568 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:35.244120 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhc28" event={"ID":"6f5ee354-6f03-4d9b-8ef6-57c8988d266c","Type":"ContainerDied","Data":"f28915776c8f5aba18c9f9aaa0006a24abc3e25f2f6e054346508bc8c0c7db61"}
Apr 24 19:07:37.116261 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:37.116227 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn"
Apr 24 19:07:37.116861 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:37.116317 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2"
Apr 24 19:07:37.116861 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:37.116350 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26"
Apr 24 19:07:37.116861 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:37.116540 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:07:37.117057 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:37.117033 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1"
Apr 24 19:07:37.117099 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:37.116646 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69"
Apr 24 19:07:39.115857 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.115647 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:07:39.116323 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.115647 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn"
Apr 24 19:07:39.116323 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:39.115967 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1"
Apr 24 19:07:39.116323 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:39.115996 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r98mn" podUID="fd984433-e82d-4be9-964f-829123c5bb26"
Apr 24 19:07:39.116323 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.115647 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2"
Apr 24 19:07:39.116323 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:39.116128 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xf9r2" podUID="6f8ef441-4de5-492e-9e4a-1c61639cde69"
Apr 24 19:07:39.882370 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.882292 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-214.ec2.internal" event="NodeReady"
Apr 24 19:07:39.882598 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.882449 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 19:07:39.934819 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.934785 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fgzls"]
Apr 24 19:07:39.962048 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.962015 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-d8rzc"]
Apr 24 19:07:39.962222 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.962200 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:39.965968 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.965097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 19:07:39.965968 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.965157 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 19:07:39.965968 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.965407 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d7j6w\""
Apr 24 19:07:39.976147 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.976118 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fgzls"]
Apr 24 19:07:39.976147 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.976145 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d8rzc"]
Apr 24 19:07:39.976324 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.976248 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d8rzc"
Apr 24 19:07:39.978805 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.978722 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 19:07:39.978933 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.978856 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tkcsj\""
Apr 24 19:07:39.978933 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.978891 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 19:07:39.979051 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:39.978972 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 19:07:40.037372 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.037336 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69gvv\" (UniqueName: \"kubernetes.io/projected/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-kube-api-access-69gvv\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.037579 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.037397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-config-volume\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.037579 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.037452 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.037579 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.037544 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-tmp-dir\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.138068 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.137986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-config-volume\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.138068 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.138027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.138068 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.138058 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-tmp-dir\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.138719 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.138092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92f89\" (UniqueName: \"kubernetes.io/projected/429311fb-ef10-40c4-958e-de80bbde38f2-kube-api-access-92f89\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc"
Apr 24 19:07:40.138719 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.138149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc"
Apr 24 19:07:40.138719 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.138162 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:40.138719 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.138208 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69gvv\" (UniqueName: \"kubernetes.io/projected/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-kube-api-access-69gvv\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.138719 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.138293 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls podName:c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed nodeName:}" failed. No retries permitted until 2026-04-24 19:07:40.638278343 +0000 UTC m=+34.108437849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls") pod "dns-default-fgzls" (UID: "c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:40.138719 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.138480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-tmp-dir\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.139013 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.138945 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-config-volume\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.149640 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.149580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69gvv\" (UniqueName: \"kubernetes.io/projected/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-kube-api-access-69gvv\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.238728 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.238683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92f89\" (UniqueName: \"kubernetes.io/projected/429311fb-ef10-40c4-958e-de80bbde38f2-kube-api-access-92f89\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc"
Apr 24 19:07:40.238913 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.238770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc"
Apr 24 19:07:40.238913 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.238885 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:40.239035 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.238959 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert podName:429311fb-ef10-40c4-958e-de80bbde38f2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:40.738939122 +0000 UTC m=+34.209098629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert") pod "ingress-canary-d8rzc" (UID: "429311fb-ef10-40c4-958e-de80bbde38f2") : secret "canary-serving-cert" not found
Apr 24 19:07:40.249780 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.249750 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92f89\" (UniqueName: \"kubernetes.io/projected/429311fb-ef10-40c4-958e-de80bbde38f2-kube-api-access-92f89\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc"
Apr 24 19:07:40.641394 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.641354 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:40.641728 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.641531 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:40.641728 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.641597 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls podName:c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed nodeName:}" failed. No retries permitted until 2026-04-24 19:07:41.641576577 +0000 UTC m=+35.111736126 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls") pod "dns-default-fgzls" (UID: "c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:40.742624 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.742562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:07:40.742787 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.742656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc"
Apr 24 19:07:40.742787 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.742723 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:40.742787 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.742743 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:40.742891 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.742791 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert podName:429311fb-ef10-40c4-958e-de80bbde38f2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:41.742778132 +0000 UTC m=+35.212937638 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert") pod "ingress-canary-d8rzc" (UID: "429311fb-ef10-40c4-958e-de80bbde38f2") : secret "canary-serving-cert" not found
Apr 24 19:07:40.742891 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.742803 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs podName:8a81da49-19b8-407f-a961-d85a0ec045e1 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:12.742797706 +0000 UTC m=+66.212957212 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs") pod "network-metrics-daemon-cr4ls" (UID: "8a81da49-19b8-407f-a961-d85a0ec045e1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:40.843315 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:40.843279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn"
Apr 24 19:07:40.843484 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.843438 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:07:40.843484 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.843456 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:07:40.843484 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.843465 2573 projected.go:194] Error preparing data for projected volume kube-api-access-p88zd for pod openshift-network-diagnostics/network-check-target-r98mn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:40.843586 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:40.843519 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd podName:fd984433-e82d-4be9-964f-829123c5bb26 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:12.843504075 +0000 UTC m=+66.313663581 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-p88zd" (UniqueName: "kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd") pod "network-check-target-r98mn" (UID: "fd984433-e82d-4be9-964f-829123c5bb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:41.116102 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.116021 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn"
Apr 24 19:07:41.116250 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.116021 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:07:41.116250 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.116021 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2"
Apr 24 19:07:41.119858 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.119801 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 19:07:41.119858 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.119828 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 19:07:41.119858 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.119842 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 19:07:41.119858 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.119854 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-72mrf\""
Apr 24 19:07:41.120123 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.119893 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h6pgv\""
Apr 24 19:07:41.120123 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.119858 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 19:07:41.650501 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.650414 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:41.650936 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:41.650556 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:41.650936 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:41.650637 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls podName:c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed nodeName:}" failed. No retries permitted until 2026-04-24 19:07:43.650622332 +0000 UTC m=+37.120781851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls") pod "dns-default-fgzls" (UID: "c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:41.750900 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:41.750859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc"
Apr 24 19:07:41.751059 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:41.750998 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:41.751103 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:41.751060 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert podName:429311fb-ef10-40c4-958e-de80bbde38f2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:43.751046121 +0000 UTC m=+37.221205627 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert") pod "ingress-canary-d8rzc" (UID: "429311fb-ef10-40c4-958e-de80bbde38f2") : secret "canary-serving-cert" not found
Apr 24 19:07:42.259969 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:42.259938 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f5ee354-6f03-4d9b-8ef6-57c8988d266c" containerID="e58172c440df02e95fc3cd36e7aebcadb2349298c82c4bd31fa2723093d3edfd" exitCode=0
Apr 24 19:07:42.260147 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:42.259990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhc28" event={"ID":"6f5ee354-6f03-4d9b-8ef6-57c8988d266c","Type":"ContainerDied","Data":"e58172c440df02e95fc3cd36e7aebcadb2349298c82c4bd31fa2723093d3edfd"}
Apr 24 19:07:43.264666 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:43.264627 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f5ee354-6f03-4d9b-8ef6-57c8988d266c" containerID="b73bb89a4030bdfbcef1bf5153cb9d7e617489de627a8af2c01f42f611218d6b" exitCode=0
Apr 24 19:07:43.265061 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:43.264683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhc28" event={"ID":"6f5ee354-6f03-4d9b-8ef6-57c8988d266c","Type":"ContainerDied","Data":"b73bb89a4030bdfbcef1bf5153cb9d7e617489de627a8af2c01f42f611218d6b"}
Apr 24 19:07:43.666232 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:43.666010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls"
Apr 24 19:07:43.666419 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:43.666159 2573 secret.go:189] Couldn't get secret
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:43.666419 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:43.666311 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls podName:c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed nodeName:}" failed. No retries permitted until 2026-04-24 19:07:47.666293475 +0000 UTC m=+41.136452985 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls") pod "dns-default-fgzls" (UID: "c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed") : secret "dns-default-metrics-tls" not found Apr 24 19:07:43.766860 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:43.766824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc" Apr 24 19:07:43.767040 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:43.766992 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:43.767103 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:43.767077 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert podName:429311fb-ef10-40c4-958e-de80bbde38f2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:47.767054628 +0000 UTC m=+41.237214150 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert") pod "ingress-canary-d8rzc" (UID: "429311fb-ef10-40c4-958e-de80bbde38f2") : secret "canary-serving-cert" not found Apr 24 19:07:44.269418 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:44.269381 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhc28" event={"ID":"6f5ee354-6f03-4d9b-8ef6-57c8988d266c","Type":"ContainerStarted","Data":"64eeeadd8a3f438912c758ee3e50767cc485920315231d216b79f12b429d0d55"} Apr 24 19:07:44.297310 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:44.297261 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zhc28" podStartSLOduration=6.004718078 podStartE2EDuration="37.297248263s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.896725356 +0000 UTC m=+3.366884866" lastFinishedPulling="2026-04-24 19:07:41.189255545 +0000 UTC m=+34.659415051" observedRunningTime="2026-04-24 19:07:44.295846213 +0000 UTC m=+37.766005743" watchObservedRunningTime="2026-04-24 19:07:44.297248263 +0000 UTC m=+37.767407790" Apr 24 19:07:47.695762 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:47.695725 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls" Apr 24 19:07:47.696278 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:47.695871 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:47.696278 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:47.695946 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls podName:c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed nodeName:}" failed. No retries permitted until 2026-04-24 19:07:55.695925198 +0000 UTC m=+49.166084707 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls") pod "dns-default-fgzls" (UID: "c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed") : secret "dns-default-metrics-tls" not found Apr 24 19:07:47.796699 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:47.796665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc" Apr 24 19:07:47.796870 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:47.796829 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:47.796931 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:47.796904 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert podName:429311fb-ef10-40c4-958e-de80bbde38f2 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:55.796881633 +0000 UTC m=+49.267041154 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert") pod "ingress-canary-d8rzc" (UID: "429311fb-ef10-40c4-958e-de80bbde38f2") : secret "canary-serving-cert" not found Apr 24 19:07:47.998550 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:47.998460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:48.001839 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:48.001815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6f8ef441-4de5-492e-9e4a-1c61639cde69-original-pull-secret\") pod \"global-pull-secret-syncer-xf9r2\" (UID: \"6f8ef441-4de5-492e-9e4a-1c61639cde69\") " pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:48.037115 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:48.037082 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xf9r2" Apr 24 19:07:48.158361 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:48.158337 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xf9r2"] Apr 24 19:07:48.161513 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:07:48.161484 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f8ef441_4de5_492e_9e4a_1c61639cde69.slice/crio-fb2ef4e261634b83e4c30496a6e63f7bee834c98ced435fec0389365913dadad WatchSource:0}: Error finding container fb2ef4e261634b83e4c30496a6e63f7bee834c98ced435fec0389365913dadad: Status 404 returned error can't find the container with id fb2ef4e261634b83e4c30496a6e63f7bee834c98ced435fec0389365913dadad Apr 24 19:07:48.278341 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:48.278275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xf9r2" event={"ID":"6f8ef441-4de5-492e-9e4a-1c61639cde69","Type":"ContainerStarted","Data":"fb2ef4e261634b83e4c30496a6e63f7bee834c98ced435fec0389365913dadad"} Apr 24 19:07:53.289754 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:53.289717 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xf9r2" event={"ID":"6f8ef441-4de5-492e-9e4a-1c61639cde69","Type":"ContainerStarted","Data":"5d0eb6a0aef5d1362b1ae6eb92bda8dd03f4cf7ecb26e522099fb64e595a508c"} Apr 24 19:07:53.303390 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:53.303293 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xf9r2" podStartSLOduration=33.159145153 podStartE2EDuration="37.303278736s" podCreationTimestamp="2026-04-24 19:07:16 +0000 UTC" firstStartedPulling="2026-04-24 19:07:48.163307033 +0000 UTC m=+41.633466539" lastFinishedPulling="2026-04-24 19:07:52.307440598 +0000 UTC m=+45.777600122" 
observedRunningTime="2026-04-24 19:07:53.302938851 +0000 UTC m=+46.773098378" watchObservedRunningTime="2026-04-24 19:07:53.303278736 +0000 UTC m=+46.773438263" Apr 24 19:07:55.750756 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:55.750718 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls" Apr 24 19:07:55.751206 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:55.750860 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:55.751206 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:55.750935 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls podName:c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed nodeName:}" failed. No retries permitted until 2026-04-24 19:08:11.750916457 +0000 UTC m=+65.221075963 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls") pod "dns-default-fgzls" (UID: "c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed") : secret "dns-default-metrics-tls" not found Apr 24 19:07:55.851406 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:07:55.851372 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc" Apr 24 19:07:55.851545 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:55.851515 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:55.851629 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:07:55.851581 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert podName:429311fb-ef10-40c4-958e-de80bbde38f2 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:11.85156634 +0000 UTC m=+65.321725846 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert") pod "ingress-canary-d8rzc" (UID: "429311fb-ef10-40c4-958e-de80bbde38f2") : secret "canary-serving-cert" not found Apr 24 19:08:04.251248 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:04.251213 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k6tsb" Apr 24 19:08:11.850820 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:11.850767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls" Apr 24 19:08:11.851215 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:11.850923 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:08:11.851215 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:11.850985 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls podName:c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed nodeName:}" failed. No retries permitted until 2026-04-24 19:08:43.850968662 +0000 UTC m=+97.321128185 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls") pod "dns-default-fgzls" (UID: "c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed") : secret "dns-default-metrics-tls" not found Apr 24 19:08:11.951620 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:11.951558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc" Apr 24 19:08:11.951785 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:11.951708 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:08:11.951785 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:11.951775 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert podName:429311fb-ef10-40c4-958e-de80bbde38f2 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:43.951760162 +0000 UTC m=+97.421919668 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert") pod "ingress-canary-d8rzc" (UID: "429311fb-ef10-40c4-958e-de80bbde38f2") : secret "canary-serving-cert" not found Apr 24 19:08:12.757244 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:12.757200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:08:12.759683 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:12.759665 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 19:08:12.767679 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:12.767661 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 19:08:12.767728 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:12.767722 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs podName:8a81da49-19b8-407f-a961-d85a0ec045e1 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:16.767706678 +0000 UTC m=+130.237866184 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs") pod "network-metrics-daemon-cr4ls" (UID: "8a81da49-19b8-407f-a961-d85a0ec045e1") : secret "metrics-daemon-secret" not found Apr 24 19:08:12.857499 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:12.857469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:08:12.860051 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:12.860036 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 19:08:12.869817 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:12.869800 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 19:08:12.881012 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:12.880990 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p88zd\" (UniqueName: \"kubernetes.io/projected/fd984433-e82d-4be9-964f-829123c5bb26-kube-api-access-p88zd\") pod \"network-check-target-r98mn\" (UID: \"fd984433-e82d-4be9-964f-829123c5bb26\") " pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:08:12.927955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:12.927937 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-72mrf\"" Apr 24 19:08:12.936868 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:12.936853 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:08:13.060853 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:13.060828 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r98mn"] Apr 24 19:08:13.065822 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:08:13.065800 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd984433_e82d_4be9_964f_829123c5bb26.slice/crio-331e46f36c00f51d408b381831fcb763c9bc5d1ae1ee2a05cdd6a2f03499d548 WatchSource:0}: Error finding container 331e46f36c00f51d408b381831fcb763c9bc5d1ae1ee2a05cdd6a2f03499d548: Status 404 returned error can't find the container with id 331e46f36c00f51d408b381831fcb763c9bc5d1ae1ee2a05cdd6a2f03499d548 Apr 24 19:08:13.325989 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:13.325907 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r98mn" event={"ID":"fd984433-e82d-4be9-964f-829123c5bb26","Type":"ContainerStarted","Data":"331e46f36c00f51d408b381831fcb763c9bc5d1ae1ee2a05cdd6a2f03499d548"} Apr 24 19:08:16.333261 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:16.333225 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r98mn" event={"ID":"fd984433-e82d-4be9-964f-829123c5bb26","Type":"ContainerStarted","Data":"a66c72eac4bd08e778047b2f4f4753ecdca611926465bfadad7c40c56ffd0036"} Apr 24 19:08:16.333637 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:16.333346 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:08:16.347408 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:16.347372 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-r98mn" 
podStartSLOduration=66.28026079 podStartE2EDuration="1m9.347357319s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:08:13.069400007 +0000 UTC m=+66.539559516" lastFinishedPulling="2026-04-24 19:08:16.136496539 +0000 UTC m=+69.606656045" observedRunningTime="2026-04-24 19:08:16.347127445 +0000 UTC m=+69.817286994" watchObservedRunningTime="2026-04-24 19:08:16.347357319 +0000 UTC m=+69.817516827" Apr 24 19:08:43.857565 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:43.857536 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls" Apr 24 19:08:43.857949 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:43.857674 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:08:43.857949 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:43.857733 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls podName:c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed nodeName:}" failed. No retries permitted until 2026-04-24 19:09:47.857720036 +0000 UTC m=+161.327879541 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls") pod "dns-default-fgzls" (UID: "c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed") : secret "dns-default-metrics-tls" not found Apr 24 19:08:43.957975 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:43.957931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc" Apr 24 19:08:43.958099 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:43.958069 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:08:43.958149 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:08:43.958137 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert podName:429311fb-ef10-40c4-958e-de80bbde38f2 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:47.958122262 +0000 UTC m=+161.428281768 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert") pod "ingress-canary-d8rzc" (UID: "429311fb-ef10-40c4-958e-de80bbde38f2") : secret "canary-serving-cert" not found Apr 24 19:08:47.338729 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:08:47.338699 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-r98mn" Apr 24 19:09:14.702054 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.702005 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"] Apr 24 19:09:14.704711 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.704691 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-54854595f4-ljrqx"] Apr 24 19:09:14.704865 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.704843 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4" Apr 24 19:09:14.707207 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.707187 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:14.707656 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.707629 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 19:09:14.707763 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.707750 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-584t9\"" Apr 24 19:09:14.710317 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.710295 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 19:09:14.712376 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.712074 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 19:09:14.712376 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.712200 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 19:09:14.712376 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.712235 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 19:09:14.712376 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.712241 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 19:09:14.712376 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.712253 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vsv9w\"" Apr 24 19:09:14.712376 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.712246 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 19:09:14.712376 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.712321 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 19:09:14.712818 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.712801 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 19:09:14.718843 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.718821 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 19:09:14.729318 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.729283 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"]
Apr 24 19:09:14.730264 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.730240 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-54854595f4-ljrqx"]
Apr 24 19:09:14.765583 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.765546 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-default-certificate\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.765783 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.765674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.765783 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.765707 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkvqq\" (UniqueName: \"kubernetes.io/projected/36d3fa5b-8237-4810-9120-a6a9421e039b-kube-api-access-lkvqq\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:14.765783 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.765731 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzb87\" (UniqueName: \"kubernetes.io/projected/510e887c-a3b0-44b9-b21c-9137b720d224-kube-api-access-pzb87\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.765886 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.765797 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:14.765886 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.765832 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-stats-auth\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.765886 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.765859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/36d3fa5b-8237-4810-9120-a6a9421e039b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:14.765886 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.765875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.799207 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.799169 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-545tt"]
Apr 24 19:09:14.802067 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.802049 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.804918 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.804891 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 19:09:14.804918 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.804906 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 19:09:14.804918 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.804893 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 19:09:14.805199 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.804941 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-n7kkq\""
Apr 24 19:09:14.805333 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.805317 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 19:09:14.810502 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.810476 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 19:09:14.816875 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.816847 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-545tt"]
Apr 24 19:09:14.867195 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:14.867195 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-stats-auth\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.867432 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867224 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/36d3fa5b-8237-4810-9120-a6a9421e039b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:14.867432 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.867432 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867277 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48af1cfa-4ff5-4543-8939-d43ac71b40ad-serving-cert\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.867432 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867300 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48af1cfa-4ff5-4543-8939-d43ac71b40ad-trusted-ca\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.867432 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:14.867320 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 19:09:14.867432 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-default-certificate\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.867432 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:14.867418 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:15.367391757 +0000 UTC m=+128.837551283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : configmap references non-existent config key: service-ca.crt
Apr 24 19:09:14.867775 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867508 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhxwz\" (UniqueName: \"kubernetes.io/projected/48af1cfa-4ff5-4543-8939-d43ac71b40ad-kube-api-access-qhxwz\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.867775 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.867775 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:14.867654 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls podName:36d3fa5b-8237-4810-9120-a6a9421e039b nodeName:}" failed. No retries permitted until 2026-04-24 19:09:15.3676341 +0000 UTC m=+128.837793610 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4lvn4" (UID: "36d3fa5b-8237-4810-9120-a6a9421e039b") : secret "cluster-monitoring-operator-tls" not found
Apr 24 19:09:14.867775 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:14.867682 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 19:09:14.867775 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867743 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkvqq\" (UniqueName: \"kubernetes.io/projected/36d3fa5b-8237-4810-9120-a6a9421e039b-kube-api-access-lkvqq\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:14.868004 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:14.867785 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:15.367756443 +0000 UTC m=+128.837915964 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : secret "router-metrics-certs-default" not found
Apr 24 19:09:14.868004 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzb87\" (UniqueName: \"kubernetes.io/projected/510e887c-a3b0-44b9-b21c-9137b720d224-kube-api-access-pzb87\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.868004 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.867863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48af1cfa-4ff5-4543-8939-d43ac71b40ad-config\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.868408 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.868386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/36d3fa5b-8237-4810-9120-a6a9421e039b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:14.869985 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.869963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-default-certificate\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.870041 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.870008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-stats-auth\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.880154 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.880118 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkvqq\" (UniqueName: \"kubernetes.io/projected/36d3fa5b-8237-4810-9120-a6a9421e039b-kube-api-access-lkvqq\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:14.880288 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.880169 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzb87\" (UniqueName: \"kubernetes.io/projected/510e887c-a3b0-44b9-b21c-9137b720d224-kube-api-access-pzb87\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:14.968552 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.968456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48af1cfa-4ff5-4543-8939-d43ac71b40ad-serving-cert\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.968552 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.968496 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48af1cfa-4ff5-4543-8939-d43ac71b40ad-trusted-ca\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.968552 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.968540 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhxwz\" (UniqueName: \"kubernetes.io/projected/48af1cfa-4ff5-4543-8939-d43ac71b40ad-kube-api-access-qhxwz\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.968841 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.968582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48af1cfa-4ff5-4543-8939-d43ac71b40ad-config\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.969317 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.969282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48af1cfa-4ff5-4543-8939-d43ac71b40ad-config\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.969706 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.969688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48af1cfa-4ff5-4543-8939-d43ac71b40ad-trusted-ca\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.971113 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.971093 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48af1cfa-4ff5-4543-8939-d43ac71b40ad-serving-cert\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:14.980898 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:14.980873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhxwz\" (UniqueName: \"kubernetes.io/projected/48af1cfa-4ff5-4543-8939-d43ac71b40ad-kube-api-access-qhxwz\") pod \"console-operator-9d4b6777b-545tt\" (UID: \"48af1cfa-4ff5-4543-8939-d43ac71b40ad\") " pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:15.111852 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:15.111809 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-545tt"
Apr 24 19:09:15.231633 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:15.231534 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-545tt"]
Apr 24 19:09:15.234965 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:15.234926 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48af1cfa_4ff5_4543_8939_d43ac71b40ad.slice/crio-77f9b947d8a8cf2c9e410c3d94c8ed4c363f3e07d8b9e477daceef21a1eda6a0 WatchSource:0}: Error finding container 77f9b947d8a8cf2c9e410c3d94c8ed4c363f3e07d8b9e477daceef21a1eda6a0: Status 404 returned error can't find the container with id 77f9b947d8a8cf2c9e410c3d94c8ed4c363f3e07d8b9e477daceef21a1eda6a0
Apr 24 19:09:15.372398 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:15.372352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:15.372398 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:15.372396 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 19:09:15.372669 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:15.372425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:15.372669 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:15.372453 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls podName:36d3fa5b-8237-4810-9120-a6a9421e039b nodeName:}" failed. No retries permitted until 2026-04-24 19:09:16.372439303 +0000 UTC m=+129.842598809 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4lvn4" (UID: "36d3fa5b-8237-4810-9120-a6a9421e039b") : secret "cluster-monitoring-operator-tls" not found
Apr 24 19:09:15.372669 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:15.372520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:15.372669 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:15.372562 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 19:09:15.372669 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:15.372574 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:16.3725656 +0000 UTC m=+129.842725106 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : configmap references non-existent config key: service-ca.crt
Apr 24 19:09:15.372669 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:15.372596 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:16.37258508 +0000 UTC m=+129.842744587 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : secret "router-metrics-certs-default" not found
Apr 24 19:09:15.447065 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:15.447029 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" event={"ID":"48af1cfa-4ff5-4543-8939-d43ac71b40ad","Type":"ContainerStarted","Data":"77f9b947d8a8cf2c9e410c3d94c8ed4c363f3e07d8b9e477daceef21a1eda6a0"}
Apr 24 19:09:16.267695 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.267660 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m"]
Apr 24 19:09:16.270889 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.270865 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m"
Apr 24 19:09:16.273595 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.273569 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-74tdl\""
Apr 24 19:09:16.279891 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.279566 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m"]
Apr 24 19:09:16.381737 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.381701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:16.381909 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.381752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:16.381909 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.381786 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wgz\" (UniqueName: \"kubernetes.io/projected/9a5a78d0-d426-447b-ab33-a09e0d9966e1-kube-api-access-87wgz\") pod \"network-check-source-8894fc9bd-lh97m\" (UID: \"9a5a78d0-d426-447b-ab33-a09e0d9966e1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m"
Apr 24 19:09:16.381909 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:16.381856 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 19:09:16.381909 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:16.381879 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:18.381857016 +0000 UTC m=+131.852016522 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : configmap references non-existent config key: service-ca.crt
Apr 24 19:09:16.381909 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.381909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:16.382158 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:16.381955 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls podName:36d3fa5b-8237-4810-9120-a6a9421e039b nodeName:}" failed. No retries permitted until 2026-04-24 19:09:18.38194573 +0000 UTC m=+131.852105237 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4lvn4" (UID: "36d3fa5b-8237-4810-9120-a6a9421e039b") : secret "cluster-monitoring-operator-tls" not found
Apr 24 19:09:16.382158 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:16.381983 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 19:09:16.382158 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:16.382030 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:18.382018163 +0000 UTC m=+131.852177673 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : secret "router-metrics-certs-default" not found
Apr 24 19:09:16.483066 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.483022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87wgz\" (UniqueName: \"kubernetes.io/projected/9a5a78d0-d426-447b-ab33-a09e0d9966e1-kube-api-access-87wgz\") pod \"network-check-source-8894fc9bd-lh97m\" (UID: \"9a5a78d0-d426-447b-ab33-a09e0d9966e1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m"
Apr 24 19:09:16.493349 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.493312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wgz\" (UniqueName: \"kubernetes.io/projected/9a5a78d0-d426-447b-ab33-a09e0d9966e1-kube-api-access-87wgz\") pod \"network-check-source-8894fc9bd-lh97m\" (UID: \"9a5a78d0-d426-447b-ab33-a09e0d9966e1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m"
Apr 24 19:09:16.582655 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.582545 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m"
Apr 24 19:09:16.784852 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.784809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls"
Apr 24 19:09:16.785051 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:16.784984 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 19:09:16.785119 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:16.785074 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs podName:8a81da49-19b8-407f-a961-d85a0ec045e1 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:18.785048093 +0000 UTC m=+252.255207605 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs") pod "network-metrics-daemon-cr4ls" (UID: "8a81da49-19b8-407f-a961-d85a0ec045e1") : secret "metrics-daemon-secret" not found
Apr 24 19:09:16.994455 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:16.994423 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m"]
Apr 24 19:09:16.997786 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:16.997753 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a5a78d0_d426_447b_ab33_a09e0d9966e1.slice/crio-1c7a2022379540227d62afc05b2e628cb5fb087cb0dc11c6da7a1ce4ddbf86f3 WatchSource:0}: Error finding container 1c7a2022379540227d62afc05b2e628cb5fb087cb0dc11c6da7a1ce4ddbf86f3: Status 404 returned error can't find the container with id 1c7a2022379540227d62afc05b2e628cb5fb087cb0dc11c6da7a1ce4ddbf86f3
Apr 24 19:09:17.453097 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:17.453071 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/0.log"
Apr 24 19:09:17.453537 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:17.453109 2573 generic.go:358] "Generic (PLEG): container finished" podID="48af1cfa-4ff5-4543-8939-d43ac71b40ad" containerID="f2d16a5f4eb4a621094af0b4d24f9956d92f4a19f663770f3d93e04a34799668" exitCode=255
Apr 24 19:09:17.453537 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:17.453173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" event={"ID":"48af1cfa-4ff5-4543-8939-d43ac71b40ad","Type":"ContainerDied","Data":"f2d16a5f4eb4a621094af0b4d24f9956d92f4a19f663770f3d93e04a34799668"}
Apr 24 19:09:17.453537 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:17.453513 2573 scope.go:117] "RemoveContainer" containerID="f2d16a5f4eb4a621094af0b4d24f9956d92f4a19f663770f3d93e04a34799668"
Apr 24 19:09:17.454580 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:17.454553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m" event={"ID":"9a5a78d0-d426-447b-ab33-a09e0d9966e1","Type":"ContainerStarted","Data":"9c0d5691f3d6a1e72bff45802ca28b07c462b8c57e9c0bdefec2116c5bdcd5c2"}
Apr 24 19:09:17.454722 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:17.454585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m" event={"ID":"9a5a78d0-d426-447b-ab33-a09e0d9966e1","Type":"ContainerStarted","Data":"1c7a2022379540227d62afc05b2e628cb5fb087cb0dc11c6da7a1ce4ddbf86f3"}
Apr 24 19:09:17.502547 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:17.502498 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lh97m" podStartSLOduration=1.502483234 podStartE2EDuration="1.502483234s" podCreationTimestamp="2026-04-24 19:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:09:17.501206613 +0000 UTC m=+130.971366142" watchObservedRunningTime="2026-04-24 19:09:17.502483234 +0000 UTC m=+130.972642793"
Apr 24 19:09:18.398845 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:18.398800 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:18.399094 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:18.398862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"
Apr 24 19:09:18.399094 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:18.398884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx"
Apr 24 19:09:18.399094 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:18.398960 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 19:09:18.399094 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:18.399012 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 19:09:18.399094 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:18.399039 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:22.39901808 +0000 UTC m=+135.869177588 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : secret "router-metrics-certs-default" not found
Apr 24 19:09:18.399094 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:18.399059 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls podName:36d3fa5b-8237-4810-9120-a6a9421e039b nodeName:}" failed. No retries permitted until 2026-04-24 19:09:22.39904692 +0000 UTC m=+135.869206426 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4lvn4" (UID: "36d3fa5b-8237-4810-9120-a6a9421e039b") : secret "cluster-monitoring-operator-tls" not found
Apr 24 19:09:18.399094 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:18.399073 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:22.399067414 +0000 UTC m=+135.869226920 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : configmap references non-existent config key: service-ca.crt Apr 24 19:09:18.458951 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:18.458925 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/1.log" Apr 24 19:09:18.459359 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:18.459310 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/0.log" Apr 24 19:09:18.459359 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:18.459344 2573 generic.go:358] "Generic (PLEG): container finished" podID="48af1cfa-4ff5-4543-8939-d43ac71b40ad" containerID="53bf93291537cea52093d4aa275911243596d6b402cedabb2905ba61e2b956a4" exitCode=255 Apr 24 19:09:18.459446 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:18.459426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" event={"ID":"48af1cfa-4ff5-4543-8939-d43ac71b40ad","Type":"ContainerDied","Data":"53bf93291537cea52093d4aa275911243596d6b402cedabb2905ba61e2b956a4"} Apr 24 19:09:18.459482 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:18.459469 2573 scope.go:117] "RemoveContainer" containerID="f2d16a5f4eb4a621094af0b4d24f9956d92f4a19f663770f3d93e04a34799668" Apr 24 19:09:18.459719 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:18.459682 2573 scope.go:117] "RemoveContainer" containerID="53bf93291537cea52093d4aa275911243596d6b402cedabb2905ba61e2b956a4" Apr 24 19:09:18.459924 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:18.459901 2573 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-545tt_openshift-console-operator(48af1cfa-4ff5-4543-8939-d43ac71b40ad)\"" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" podUID="48af1cfa-4ff5-4543-8939-d43ac71b40ad" Apr 24 19:09:19.462533 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:19.462485 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/1.log" Apr 24 19:09:19.462955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:19.462889 2573 scope.go:117] "RemoveContainer" containerID="53bf93291537cea52093d4aa275911243596d6b402cedabb2905ba61e2b956a4" Apr 24 19:09:19.463053 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:19.463042 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-545tt_openshift-console-operator(48af1cfa-4ff5-4543-8939-d43ac71b40ad)\"" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" podUID="48af1cfa-4ff5-4543-8939-d43ac71b40ad" Apr 24 19:09:20.140598 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.140557 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr"] Apr 24 19:09:20.143838 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.143811 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr" Apr 24 19:09:20.146944 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.146916 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 19:09:20.147545 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.147521 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6r8bw\"" Apr 24 19:09:20.148930 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.148915 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 19:09:20.163930 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.163899 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr"] Apr 24 19:09:20.213333 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.213295 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57tp4\" (UniqueName: \"kubernetes.io/projected/d40c19b7-e7a8-4514-af70-6b73c6866411-kube-api-access-57tp4\") pod \"migrator-74bb7799d9-vjxtr\" (UID: \"d40c19b7-e7a8-4514-af70-6b73c6866411\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr" Apr 24 19:09:20.313735 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.313702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57tp4\" (UniqueName: \"kubernetes.io/projected/d40c19b7-e7a8-4514-af70-6b73c6866411-kube-api-access-57tp4\") pod \"migrator-74bb7799d9-vjxtr\" (UID: \"d40c19b7-e7a8-4514-af70-6b73c6866411\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr" Apr 24 19:09:20.332587 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:09:20.332550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57tp4\" (UniqueName: \"kubernetes.io/projected/d40c19b7-e7a8-4514-af70-6b73c6866411-kube-api-access-57tp4\") pod \"migrator-74bb7799d9-vjxtr\" (UID: \"d40c19b7-e7a8-4514-af70-6b73c6866411\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr" Apr 24 19:09:20.452952 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.452914 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr" Apr 24 19:09:20.588843 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.588808 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr"] Apr 24 19:09:20.592726 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:20.592682 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd40c19b7_e7a8_4514_af70_6b73c6866411.slice/crio-919715f298f941e53c6e303e664468196373d8663af81383aa95f9a7af2ca96d WatchSource:0}: Error finding container 919715f298f941e53c6e303e664468196373d8663af81383aa95f9a7af2ca96d: Status 404 returned error can't find the container with id 919715f298f941e53c6e303e664468196373d8663af81383aa95f9a7af2ca96d Apr 24 19:09:20.766802 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:20.766722 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bk6r2_79c1d341-2bed-41fe-b49c-3f1de4604feb/dns-node-resolver/0.log" Apr 24 19:09:21.468371 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:21.468329 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr" event={"ID":"d40c19b7-e7a8-4514-af70-6b73c6866411","Type":"ContainerStarted","Data":"919715f298f941e53c6e303e664468196373d8663af81383aa95f9a7af2ca96d"} 
Apr 24 19:09:21.759014 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:21.758986 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2p82x_9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92/node-ca/0.log" Apr 24 19:09:22.430711 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:22.430641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:22.430711 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:22.430717 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 19:09:22.430963 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:22.430743 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4" Apr 24 19:09:22.430963 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:22.430775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:22.430963 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:22.430793 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs 
podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:30.430771939 +0000 UTC m=+143.900931446 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : secret "router-metrics-certs-default" not found Apr 24 19:09:22.430963 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:22.430870 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle podName:510e887c-a3b0-44b9-b21c-9137b720d224 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:30.430854715 +0000 UTC m=+143.901014237 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle") pod "router-default-54854595f4-ljrqx" (UID: "510e887c-a3b0-44b9-b21c-9137b720d224") : configmap references non-existent config key: service-ca.crt Apr 24 19:09:22.430963 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:22.430878 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 19:09:22.430963 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:22.430921 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls podName:36d3fa5b-8237-4810-9120-a6a9421e039b nodeName:}" failed. No retries permitted until 2026-04-24 19:09:30.430909497 +0000 UTC m=+143.901069019 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4lvn4" (UID: "36d3fa5b-8237-4810-9120-a6a9421e039b") : secret "cluster-monitoring-operator-tls" not found Apr 24 19:09:22.472849 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:22.472816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr" event={"ID":"d40c19b7-e7a8-4514-af70-6b73c6866411","Type":"ContainerStarted","Data":"81aed15dfb1b0a37b47b594e4cedb0e8425c1c1a860fb6136da33ad4f97338d7"} Apr 24 19:09:22.473024 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:22.472856 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr" event={"ID":"d40c19b7-e7a8-4514-af70-6b73c6866411","Type":"ContainerStarted","Data":"04f0870dbc789e52299ed915f630b1eea398761df9b29c5bc9725c9237825434"} Apr 24 19:09:22.495912 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:22.495850 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vjxtr" podStartSLOduration=1.462598251 podStartE2EDuration="2.495833462s" podCreationTimestamp="2026-04-24 19:09:20 +0000 UTC" firstStartedPulling="2026-04-24 19:09:20.594573663 +0000 UTC m=+134.064733170" lastFinishedPulling="2026-04-24 19:09:21.627808861 +0000 UTC m=+135.097968381" observedRunningTime="2026-04-24 19:09:22.494448548 +0000 UTC m=+135.964608076" watchObservedRunningTime="2026-04-24 19:09:22.495833462 +0000 UTC m=+135.965992989" Apr 24 19:09:25.112686 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:25.112637 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" Apr 24 19:09:25.112686 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:09:25.112686 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" Apr 24 19:09:25.113117 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:25.113034 2573 scope.go:117] "RemoveContainer" containerID="53bf93291537cea52093d4aa275911243596d6b402cedabb2905ba61e2b956a4" Apr 24 19:09:25.113200 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:25.113186 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-545tt_openshift-console-operator(48af1cfa-4ff5-4543-8939-d43ac71b40ad)\"" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" podUID="48af1cfa-4ff5-4543-8939-d43ac71b40ad" Apr 24 19:09:30.494343 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:30.494307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4" Apr 24 19:09:30.494343 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:30.494349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:30.494955 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:30.494402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:30.494955 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:30.494446 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 19:09:30.494955 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:30.494507 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls podName:36d3fa5b-8237-4810-9120-a6a9421e039b nodeName:}" failed. No retries permitted until 2026-04-24 19:09:46.494491251 +0000 UTC m=+159.964650757 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4lvn4" (UID: "36d3fa5b-8237-4810-9120-a6a9421e039b") : secret "cluster-monitoring-operator-tls" not found Apr 24 19:09:30.495157 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:30.495133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510e887c-a3b0-44b9-b21c-9137b720d224-service-ca-bundle\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:30.496775 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:30.496757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/510e887c-a3b0-44b9-b21c-9137b720d224-metrics-certs\") pod \"router-default-54854595f4-ljrqx\" (UID: \"510e887c-a3b0-44b9-b21c-9137b720d224\") " 
pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:30.620892 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:30.620848 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:30.777387 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:30.777361 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-54854595f4-ljrqx"] Apr 24 19:09:30.779736 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:30.779707 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod510e887c_a3b0_44b9_b21c_9137b720d224.slice/crio-9f3f58997ff701db154c3361ae0aa969b610be6cf0ebc0eb25d59ffafe941c31 WatchSource:0}: Error finding container 9f3f58997ff701db154c3361ae0aa969b610be6cf0ebc0eb25d59ffafe941c31: Status 404 returned error can't find the container with id 9f3f58997ff701db154c3361ae0aa969b610be6cf0ebc0eb25d59ffafe941c31 Apr 24 19:09:31.496481 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:31.496443 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54854595f4-ljrqx" event={"ID":"510e887c-a3b0-44b9-b21c-9137b720d224","Type":"ContainerStarted","Data":"c608f480ffcd5fcb170c5346022d431966691bdad202d7d190cf8b5ae4c8d0d2"} Apr 24 19:09:31.496481 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:31.496483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54854595f4-ljrqx" event={"ID":"510e887c-a3b0-44b9-b21c-9137b720d224","Type":"ContainerStarted","Data":"9f3f58997ff701db154c3361ae0aa969b610be6cf0ebc0eb25d59ffafe941c31"} Apr 24 19:09:31.621881 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:31.621841 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:31.624423 ip-10-0-131-214 kubenswrapper[2573]: I0424 
19:09:31.624397 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:31.653782 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:31.653723 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-54854595f4-ljrqx" podStartSLOduration=17.653703713 podStartE2EDuration="17.653703713s" podCreationTimestamp="2026-04-24 19:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:09:31.521410764 +0000 UTC m=+144.991570303" watchObservedRunningTime="2026-04-24 19:09:31.653703713 +0000 UTC m=+145.123863242" Apr 24 19:09:32.498751 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:32.498719 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:32.500059 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:32.500036 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-54854595f4-ljrqx" Apr 24 19:09:40.115808 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:40.115771 2573 scope.go:117] "RemoveContainer" containerID="53bf93291537cea52093d4aa275911243596d6b402cedabb2905ba61e2b956a4" Apr 24 19:09:40.518119 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:40.518088 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:09:40.518435 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:40.518420 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/1.log" Apr 24 19:09:40.518495 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:40.518451 2573 
generic.go:358] "Generic (PLEG): container finished" podID="48af1cfa-4ff5-4543-8939-d43ac71b40ad" containerID="317ebe42e0b421445b143661669fc29c7b86d2766e9aee549e82503ac5ec0642" exitCode=255 Apr 24 19:09:40.518555 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:40.518535 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" event={"ID":"48af1cfa-4ff5-4543-8939-d43ac71b40ad","Type":"ContainerDied","Data":"317ebe42e0b421445b143661669fc29c7b86d2766e9aee549e82503ac5ec0642"} Apr 24 19:09:40.518588 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:40.518581 2573 scope.go:117] "RemoveContainer" containerID="53bf93291537cea52093d4aa275911243596d6b402cedabb2905ba61e2b956a4" Apr 24 19:09:40.518909 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:40.518881 2573 scope.go:117] "RemoveContainer" containerID="317ebe42e0b421445b143661669fc29c7b86d2766e9aee549e82503ac5ec0642" Apr 24 19:09:40.519088 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:40.519070 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-545tt_openshift-console-operator(48af1cfa-4ff5-4543-8939-d43ac71b40ad)\"" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" podUID="48af1cfa-4ff5-4543-8939-d43ac71b40ad" Apr 24 19:09:41.524075 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:41.524048 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:09:42.986320 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:42.986271 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-ingress-canary/ingress-canary-d8rzc" podUID="429311fb-ef10-40c4-958e-de80bbde38f2" Apr 24 19:09:42.987368 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:42.987340 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fgzls" podUID="c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed" Apr 24 19:09:43.528948 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:43.528920 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fgzls" Apr 24 19:09:44.131691 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:44.131633 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cr4ls" podUID="8a81da49-19b8-407f-a961-d85a0ec045e1" Apr 24 19:09:45.112336 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:45.112281 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" Apr 24 19:09:45.112336 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:45.112336 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" Apr 24 19:09:45.112725 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:45.112709 2573 scope.go:117] "RemoveContainer" containerID="317ebe42e0b421445b143661669fc29c7b86d2766e9aee549e82503ac5ec0642" Apr 24 19:09:45.112900 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:45.112881 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-545tt_openshift-console-operator(48af1cfa-4ff5-4543-8939-d43ac71b40ad)\"" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" podUID="48af1cfa-4ff5-4543-8939-d43ac71b40ad" Apr 24 19:09:46.367414 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.367382 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ncvw4"] Apr 24 19:09:46.371569 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.371550 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.379560 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.379531 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 19:09:46.379807 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.379531 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 19:09:46.379887 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.379573 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fsgkc\"" Apr 24 19:09:46.379887 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.379595 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 19:09:46.380001 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.379665 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 19:09:46.405316 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.405276 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ncvw4"] Apr 24 19:09:46.514862 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.514818 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-crio-socket\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.515038 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.514935 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-data-volume\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.515038 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.514977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.515038 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.515002 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bvvw\" (UniqueName: \"kubernetes.io/projected/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-kube-api-access-5bvvw\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.515038 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.515027 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.515164 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.515097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4" Apr 24 19:09:46.517510 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.517479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/36d3fa5b-8237-4810-9120-a6a9421e039b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4lvn4\" (UID: \"36d3fa5b-8237-4810-9120-a6a9421e039b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4" Apr 24 19:09:46.616415 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.616321 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.616415 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.616373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bvvw\" (UniqueName: \"kubernetes.io/projected/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-kube-api-access-5bvvw\") pod \"insights-runtime-extractor-ncvw4\" (UID: 
\"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.616415 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.616403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.616683 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.616469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-crio-socket\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.616683 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.616537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-data-volume\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.616683 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.616638 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-crio-socket\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.616916 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.616891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-data-volume\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.617771 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.617751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.619016 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.618992 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.628572 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.628544 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bvvw\" (UniqueName: \"kubernetes.io/projected/0844d470-2d66-44fe-8ddb-05c8d01dc2c8-kube-api-access-5bvvw\") pod \"insights-runtime-extractor-ncvw4\" (UID: \"0844d470-2d66-44fe-8ddb-05c8d01dc2c8\") " pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.681568 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.681519 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ncvw4" Apr 24 19:09:46.814795 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.814763 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ncvw4"] Apr 24 19:09:46.815013 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.814840 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4" Apr 24 19:09:46.820138 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:46.820101 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0844d470_2d66_44fe_8ddb_05c8d01dc2c8.slice/crio-c429b376ad407c79632b165d64cd5ea21ab741453c935001248375e0e068cf74 WatchSource:0}: Error finding container c429b376ad407c79632b165d64cd5ea21ab741453c935001248375e0e068cf74: Status 404 returned error can't find the container with id c429b376ad407c79632b165d64cd5ea21ab741453c935001248375e0e068cf74 Apr 24 19:09:46.978104 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:46.978082 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4"] Apr 24 19:09:46.979810 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:46.979782 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d3fa5b_8237_4810_9120_a6a9421e039b.slice/crio-ac5796b3efa56dfb9da716aa4d8c4bae16766308945e18997dc31485b4fcceca WatchSource:0}: Error finding container ac5796b3efa56dfb9da716aa4d8c4bae16766308945e18997dc31485b4fcceca: Status 404 returned error can't find the container with id ac5796b3efa56dfb9da716aa4d8c4bae16766308945e18997dc31485b4fcceca Apr 24 19:09:47.541783 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:47.541644 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-ncvw4" event={"ID":"0844d470-2d66-44fe-8ddb-05c8d01dc2c8","Type":"ContainerStarted","Data":"fb5eae4d93e06d6dabedc5f004c57b83f3694308431445b96e60db2c41f89b18"} Apr 24 19:09:47.542197 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:47.541795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ncvw4" event={"ID":"0844d470-2d66-44fe-8ddb-05c8d01dc2c8","Type":"ContainerStarted","Data":"c429b376ad407c79632b165d64cd5ea21ab741453c935001248375e0e068cf74"} Apr 24 19:09:47.543027 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:47.542984 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4" event={"ID":"36d3fa5b-8237-4810-9120-a6a9421e039b","Type":"ContainerStarted","Data":"ac5796b3efa56dfb9da716aa4d8c4bae16766308945e18997dc31485b4fcceca"} Apr 24 19:09:47.928268 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:47.928226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls" Apr 24 19:09:47.931566 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:47.931532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed-metrics-tls\") pod \"dns-default-fgzls\" (UID: \"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed\") " pod="openshift-dns/dns-default-fgzls" Apr 24 19:09:48.029179 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:48.029138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: 
\"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc" Apr 24 19:09:48.032320 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:48.032294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/429311fb-ef10-40c4-958e-de80bbde38f2-cert\") pod \"ingress-canary-d8rzc\" (UID: \"429311fb-ef10-40c4-958e-de80bbde38f2\") " pod="openshift-ingress-canary/ingress-canary-d8rzc" Apr 24 19:09:48.033128 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:48.033038 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d7j6w\"" Apr 24 19:09:48.039359 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:48.039331 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fgzls" Apr 24 19:09:48.213659 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:48.213536 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fgzls"] Apr 24 19:09:48.547456 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:48.547364 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ncvw4" event={"ID":"0844d470-2d66-44fe-8ddb-05c8d01dc2c8","Type":"ContainerStarted","Data":"f613a924db69792e47e0de2e1a5ba9c28147925473778c392a983495048b3004"} Apr 24 19:09:48.584807 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:48.584771 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3fa5dcb_318b_41df_bc88_8b9ba1e9d4ed.slice/crio-99d339e71c02d5893d61fb36f9b8591178cbdf07fb705dd822d22c41be919817 WatchSource:0}: Error finding container 99d339e71c02d5893d61fb36f9b8591178cbdf07fb705dd822d22c41be919817: Status 404 returned error can't find the container with id 99d339e71c02d5893d61fb36f9b8591178cbdf07fb705dd822d22c41be919817 Apr 24 19:09:49.552498 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:09:49.552453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ncvw4" event={"ID":"0844d470-2d66-44fe-8ddb-05c8d01dc2c8","Type":"ContainerStarted","Data":"42fa599b89b41dee68dca434c2f84cbbcccfc7938198e332035c0608f149db42"} Apr 24 19:09:49.553861 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.553827 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fgzls" event={"ID":"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed","Type":"ContainerStarted","Data":"99d339e71c02d5893d61fb36f9b8591178cbdf07fb705dd822d22c41be919817"} Apr 24 19:09:49.555264 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.555233 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4" event={"ID":"36d3fa5b-8237-4810-9120-a6a9421e039b","Type":"ContainerStarted","Data":"b942f6186cf42465095ff04031f8ff59dcdebbf126c4520f304a065a39bb7727"} Apr 24 19:09:49.589164 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.589100 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ncvw4" podStartSLOduration=1.433179934 podStartE2EDuration="3.589082878s" podCreationTimestamp="2026-04-24 19:09:46 +0000 UTC" firstStartedPulling="2026-04-24 19:09:46.895649703 +0000 UTC m=+160.365809212" lastFinishedPulling="2026-04-24 19:09:49.051552644 +0000 UTC m=+162.521712156" observedRunningTime="2026-04-24 19:09:49.585778328 +0000 UTC m=+163.055937856" watchObservedRunningTime="2026-04-24 19:09:49.589082878 +0000 UTC m=+163.059242396" Apr 24 19:09:49.622334 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.622277 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4lvn4" podStartSLOduration=33.554475964 podStartE2EDuration="35.622263231s" podCreationTimestamp="2026-04-24 19:09:14 +0000 UTC" 
firstStartedPulling="2026-04-24 19:09:46.981574119 +0000 UTC m=+160.451733624" lastFinishedPulling="2026-04-24 19:09:49.049361371 +0000 UTC m=+162.519520891" observedRunningTime="2026-04-24 19:09:49.620976103 +0000 UTC m=+163.091135642" watchObservedRunningTime="2026-04-24 19:09:49.622263231 +0000 UTC m=+163.092422758" Apr 24 19:09:49.815450 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.815367 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg"] Apr 24 19:09:49.818714 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.818688 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" Apr 24 19:09:49.840312 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.840278 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 19:09:49.840312 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.840285 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-24tpq\"" Apr 24 19:09:49.857932 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.857896 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg"] Apr 24 19:09:49.946597 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:49.946550 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/342fbf06-1cb6-4b01-abb5-6eda8b2456eb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-p2bjg\" (UID: \"342fbf06-1cb6-4b01-abb5-6eda8b2456eb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" Apr 24 19:09:50.047675 
ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:50.047635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/342fbf06-1cb6-4b01-abb5-6eda8b2456eb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-p2bjg\" (UID: \"342fbf06-1cb6-4b01-abb5-6eda8b2456eb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" Apr 24 19:09:50.050494 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:50.050469 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/342fbf06-1cb6-4b01-abb5-6eda8b2456eb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-p2bjg\" (UID: \"342fbf06-1cb6-4b01-abb5-6eda8b2456eb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" Apr 24 19:09:50.128551 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:50.128452 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" Apr 24 19:09:50.286379 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:50.286349 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg"] Apr 24 19:09:50.288946 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:50.288917 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod342fbf06_1cb6_4b01_abb5_6eda8b2456eb.slice/crio-62088464e88f13fc8cc790b57e7213288dc29aebb46704bab6d92e50e736e1db WatchSource:0}: Error finding container 62088464e88f13fc8cc790b57e7213288dc29aebb46704bab6d92e50e736e1db: Status 404 returned error can't find the container with id 62088464e88f13fc8cc790b57e7213288dc29aebb46704bab6d92e50e736e1db Apr 24 19:09:50.559722 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:50.559683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fgzls" event={"ID":"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed","Type":"ContainerStarted","Data":"908e33c21064fb3a133fcc07ff44742662beab3f4e71376787ea060788416013"} Apr 24 19:09:50.559722 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:50.559722 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fgzls" event={"ID":"c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed","Type":"ContainerStarted","Data":"76c5ae027d231720bbb663044bc384d4141e980c5d39f1fd92a32acadb789c7a"} Apr 24 19:09:50.560229 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:50.559799 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fgzls" Apr 24 19:09:50.560752 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:50.560729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" 
event={"ID":"342fbf06-1cb6-4b01-abb5-6eda8b2456eb","Type":"ContainerStarted","Data":"62088464e88f13fc8cc790b57e7213288dc29aebb46704bab6d92e50e736e1db"} Apr 24 19:09:50.590636 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:50.590563 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fgzls" podStartSLOduration=130.010864504 podStartE2EDuration="2m11.590546766s" podCreationTimestamp="2026-04-24 19:07:39 +0000 UTC" firstStartedPulling="2026-04-24 19:09:48.587197514 +0000 UTC m=+162.057357021" lastFinishedPulling="2026-04-24 19:09:50.166879774 +0000 UTC m=+163.637039283" observedRunningTime="2026-04-24 19:09:50.590013966 +0000 UTC m=+164.060173495" watchObservedRunningTime="2026-04-24 19:09:50.590546766 +0000 UTC m=+164.060706288" Apr 24 19:09:51.564924 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:51.564808 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" event={"ID":"342fbf06-1cb6-4b01-abb5-6eda8b2456eb","Type":"ContainerStarted","Data":"91ecffbad1a1e8a0188c232c19b236768deef05d74260402f996a3188105c5e6"} Apr 24 19:09:51.565365 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:51.565150 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" Apr 24 19:09:51.569779 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:51.569755 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" Apr 24 19:09:51.580399 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:51.580354 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-p2bjg" podStartSLOduration=1.5626812860000001 podStartE2EDuration="2.580337304s" podCreationTimestamp="2026-04-24 19:09:49 
+0000 UTC" firstStartedPulling="2026-04-24 19:09:50.29098615 +0000 UTC m=+163.761145659" lastFinishedPulling="2026-04-24 19:09:51.308642171 +0000 UTC m=+164.778801677" observedRunningTime="2026-04-24 19:09:51.579076491 +0000 UTC m=+165.049236020" watchObservedRunningTime="2026-04-24 19:09:51.580337304 +0000 UTC m=+165.050496832" Apr 24 19:09:56.115415 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.115318 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d8rzc" Apr 24 19:09:56.115909 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.115763 2573 scope.go:117] "RemoveContainer" containerID="317ebe42e0b421445b143661669fc29c7b86d2766e9aee549e82503ac5ec0642" Apr 24 19:09:56.115986 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:56.115955 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-545tt_openshift-console-operator(48af1cfa-4ff5-4543-8939-d43ac71b40ad)\"" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" podUID="48af1cfa-4ff5-4543-8939-d43ac71b40ad" Apr 24 19:09:56.118016 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.117996 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tkcsj\"" Apr 24 19:09:56.126001 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.125965 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d8rzc" Apr 24 19:09:56.283950 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.283917 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d8rzc"] Apr 24 19:09:56.287456 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:56.287402 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod429311fb_ef10_40c4_958e_de80bbde38f2.slice/crio-8337f64b902ee52fd4e2f87ca34d621b20825fe1e45c99eb6adede0d5afcd390 WatchSource:0}: Error finding container 8337f64b902ee52fd4e2f87ca34d621b20825fe1e45c99eb6adede0d5afcd390: Status 404 returned error can't find the container with id 8337f64b902ee52fd4e2f87ca34d621b20825fe1e45c99eb6adede0d5afcd390 Apr 24 19:09:56.358853 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.358818 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"] Apr 24 19:09:56.365175 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.365147 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" Apr 24 19:09:56.370323 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.370254 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 19:09:56.370323 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.370284 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-6mxpv\"" Apr 24 19:09:56.371134 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.371112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 19:09:56.378673 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.378646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 19:09:56.388860 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.388821 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/53275301-bce9-4425-9006-0998dc291f4f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" Apr 24 19:09:56.388860 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.388864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/53275301-bce9-4425-9006-0998dc291f4f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" Apr 24 
19:09:56.389078 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.388928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/53275301-bce9-4425-9006-0998dc291f4f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" Apr 24 19:09:56.389078 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.388969 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxhj7\" (UniqueName: \"kubernetes.io/projected/53275301-bce9-4425-9006-0998dc291f4f-kube-api-access-qxhj7\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" Apr 24 19:09:56.398511 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.398480 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"] Apr 24 19:09:56.457511 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.457473 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v2jhm"] Apr 24 19:09:56.460689 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.460664 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-v2jhm" Apr 24 19:09:56.465891 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.465859 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 19:09:56.466089 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.465899 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 19:09:56.466089 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.465907 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 19:09:56.466089 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.465958 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xnczg\"" Apr 24 19:09:56.489333 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489287 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-wtmp\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm" Apr 24 19:09:56.489333 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489329 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-accelerators-collector-config\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm" Apr 24 19:09:56.489548 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489353 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a0a60b9f-2691-420d-8541-a3d6737868b5-sys\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm" Apr 24 19:09:56.489548 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a0a60b9f-2691-420d-8541-a3d6737868b5-metrics-client-ca\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm" Apr 24 19:09:56.489548 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489481 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm" Apr 24 19:09:56.489548 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489501 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-tls\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm" Apr 24 19:09:56.489548 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489542 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/53275301-bce9-4425-9006-0998dc291f4f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: 
\"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" Apr 24 19:09:56.489766 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/53275301-bce9-4425-9006-0998dc291f4f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" Apr 24 19:09:56.489766 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489594 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/53275301-bce9-4425-9006-0998dc291f4f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" Apr 24 19:09:56.489766 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-textfile\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm" Apr 24 19:09:56.489766 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxw4f\" (UniqueName: \"kubernetes.io/projected/a0a60b9f-2691-420d-8541-a3d6737868b5-kube-api-access-qxw4f\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm" Apr 24 19:09:56.489766 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489685 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxhj7\" (UniqueName: \"kubernetes.io/projected/53275301-bce9-4425-9006-0998dc291f4f-kube-api-access-qxhj7\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"
Apr 24 19:09:56.489766 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:56.489691 2573 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 24 19:09:56.489766 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.489709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a0a60b9f-2691-420d-8541-a3d6737868b5-root\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.489766 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:56.489761 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53275301-bce9-4425-9006-0998dc291f4f-openshift-state-metrics-tls podName:53275301-bce9-4425-9006-0998dc291f4f nodeName:}" failed. No retries permitted until 2026-04-24 19:09:56.989743276 +0000 UTC m=+170.459902784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/53275301-bce9-4425-9006-0998dc291f4f-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-n4gvt" (UID: "53275301-bce9-4425-9006-0998dc291f4f") : secret "openshift-state-metrics-tls" not found
Apr 24 19:09:56.490324 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.490306 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/53275301-bce9-4425-9006-0998dc291f4f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"
Apr 24 19:09:56.492092 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.492062 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/53275301-bce9-4425-9006-0998dc291f4f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"
Apr 24 19:09:56.512910 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.512878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxhj7\" (UniqueName: \"kubernetes.io/projected/53275301-bce9-4425-9006-0998dc291f4f-kube-api-access-qxhj7\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"
Apr 24 19:09:56.581090 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.581052 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d8rzc" 
event={"ID":"429311fb-ef10-40c4-958e-de80bbde38f2","Type":"ContainerStarted","Data":"8337f64b902ee52fd4e2f87ca34d621b20825fe1e45c99eb6adede0d5afcd390"}
Apr 24 19:09:56.590591 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-textfile\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.590591 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590597 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxw4f\" (UniqueName: \"kubernetes.io/projected/a0a60b9f-2691-420d-8541-a3d6737868b5-kube-api-access-qxw4f\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.590864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a0a60b9f-2691-420d-8541-a3d6737868b5-root\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.590864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-wtmp\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.590864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-accelerators-collector-config\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.590864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a0a60b9f-2691-420d-8541-a3d6737868b5-sys\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.590864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590772 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a0a60b9f-2691-420d-8541-a3d6737868b5-root\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.591096 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-wtmp\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.591096 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590870 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a0a60b9f-2691-420d-8541-a3d6737868b5-sys\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.591096 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a0a60b9f-2691-420d-8541-a3d6737868b5-metrics-client-ca\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.591096 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.590994 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-textfile\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.591096 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.591010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.591096 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.591044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-tls\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.591406 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.591385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a0a60b9f-2691-420d-8541-a3d6737868b5-metrics-client-ca\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.591477 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.591388 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-accelerators-collector-config\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.593430 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.593405 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.593552 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.593501 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a0a60b9f-2691-420d-8541-a3d6737868b5-node-exporter-tls\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.604829 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.604800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxw4f\" (UniqueName: \"kubernetes.io/projected/a0a60b9f-2691-420d-8541-a3d6737868b5-kube-api-access-qxw4f\") pod \"node-exporter-v2jhm\" (UID: \"a0a60b9f-2691-420d-8541-a3d6737868b5\") " pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.774123 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.774092 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-v2jhm"
Apr 24 19:09:56.783707 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:56.783667 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0a60b9f_2691_420d_8541_a3d6737868b5.slice/crio-bdbcfd35f07699e6f1cce4fbcec0d95287807bee920128cee3a934b87e784609 WatchSource:0}: Error finding container bdbcfd35f07699e6f1cce4fbcec0d95287807bee920128cee3a934b87e784609: Status 404 returned error can't find the container with id bdbcfd35f07699e6f1cce4fbcec0d95287807bee920128cee3a934b87e784609
Apr 24 19:09:56.994917 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.994876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/53275301-bce9-4425-9006-0998dc291f4f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"
Apr 24 19:09:56.998024 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:56.997985 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/53275301-bce9-4425-9006-0998dc291f4f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n4gvt\" (UID: \"53275301-bce9-4425-9006-0998dc291f4f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"
Apr 24 19:09:57.275492 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.275166 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"
Apr 24 19:09:57.407833 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.407798 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 19:09:57.413503 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.412760 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.417589 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.417293 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 19:09:57.420639 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.418507 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 19:09:57.420639 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.418759 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 19:09:57.420639 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.419049 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-w75gw\""
Apr 24 19:09:57.420639 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.419298 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 19:09:57.420639 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.419499 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 19:09:57.420639 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.419987 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 19:09:57.420639 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.420251 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 19:09:57.420639 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.420435 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 19:09:57.421139 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.420672 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 19:09:57.433129 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.433058 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 19:09:57.456982 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.456940 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt"]
Apr 24 19:09:57.497865 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.497829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-web-config\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498088 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.497879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " 
pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498088 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.497957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-config-out\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498088 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498004 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26t2\" (UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-kube-api-access-r26t2\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498088 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498039 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498088 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498349 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498124 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498349 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498159 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498349 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-config-volume\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498349 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498285 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498349 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498317 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " 
pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498363 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.498516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.498431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.552806 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:57.552721 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53275301_bce9_4425_9006_0998dc291f4f.slice/crio-4be4b1d19f7e2f71e43505ffb3ad78e1fe9280d169ad2a09e5a92e2c7af62664 WatchSource:0}: Error finding container 4be4b1d19f7e2f71e43505ffb3ad78e1fe9280d169ad2a09e5a92e2c7af62664: Status 404 returned error can't find the container with id 4be4b1d19f7e2f71e43505ffb3ad78e1fe9280d169ad2a09e5a92e2c7af62664
Apr 24 19:09:57.585548 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.585491 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" event={"ID":"53275301-bce9-4425-9006-0998dc291f4f","Type":"ContainerStarted","Data":"4be4b1d19f7e2f71e43505ffb3ad78e1fe9280d169ad2a09e5a92e2c7af62664"}
Apr 24 19:09:57.586600 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.586575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v2jhm" 
event={"ID":"a0a60b9f-2691-420d-8541-a3d6737868b5","Type":"ContainerStarted","Data":"bdbcfd35f07699e6f1cce4fbcec0d95287807bee920128cee3a934b87e784609"}
Apr 24 19:09:57.599153 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599153 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-web-config\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599457 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599457 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-config-out\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599457 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r26t2\" 
(UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-kube-api-access-r26t2\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599457 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599457 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599457 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599457 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599457 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:09:57.599459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-config-volume\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599483 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599511 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599539 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.599864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.599587 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
24 19:09:57.600790 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:09:57.600439 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-trusted-ca-bundle podName:c0adb170-f181-4624-b689-5afa3a246a87 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:58.100417075 +0000 UTC m=+171.570576581 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "c0adb170-f181-4624-b689-5afa3a246a87") : configmap references non-existent config key: ca-bundle.crt
Apr 24 19:09:57.600790 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.600761 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.604009 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.603975 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-config-out\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.604494 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.604446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.604705 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:09:57.604653 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.605129 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.605105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.605675 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.605238 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.605675 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.605292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-config-volume\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:09:57.605675 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.605637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:57.605675 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.605648 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:57.607290 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.607269 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-web-config\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:57.609561 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:57.609534 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26t2\" (UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-kube-api-access-r26t2\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:58.104082 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.104049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:58.105143 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.105109 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:58.115763 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.115740 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:09:58.331517 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.330965 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:09:58.468728 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.468694 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c"] Apr 24 19:09:58.473491 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.473465 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.480598 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.480452 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 19:09:58.480598 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.480485 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-55hfc\"" Apr 24 19:09:58.480598 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.480452 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 19:09:58.480598 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.480579 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 19:09:58.480887 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.480703 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 19:09:58.480887 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.480821 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 19:09:58.480887 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.480878 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fd2knanio0qao\"" Apr 24 19:09:58.489418 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.489392 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:09:58.491352 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:58.491329 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0adb170_f181_4624_b689_5afa3a246a87.slice/crio-000a61e93a34b681bb3ba8716f2854ba941f8b2856c0a6fa11650b985efe8632 WatchSource:0}: Error finding container 000a61e93a34b681bb3ba8716f2854ba941f8b2856c0a6fa11650b985efe8632: Status 404 returned error can't find the container with id 000a61e93a34b681bb3ba8716f2854ba941f8b2856c0a6fa11650b985efe8632 Apr 24 19:09:58.506127 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.506098 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c"] Apr 24 19:09:58.507023 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.507002 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.507097 
ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.507032 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c18ab70-91cb-4217-b088-7eadecdfc842-metrics-client-ca\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.507097 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.507064 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.507171 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.507102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-grpc-tls\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.507171 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.507139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh8xw\" (UniqueName: \"kubernetes.io/projected/5c18ab70-91cb-4217-b088-7eadecdfc842-kube-api-access-bh8xw\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.507252 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.507193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.507252 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.507232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-tls\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.507326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.507268 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.591490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.591396 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" event={"ID":"53275301-bce9-4425-9006-0998dc291f4f","Type":"ContainerStarted","Data":"8a877916c6ec6bf1f41c91423ca4acaec3b92c32c1347752fe303ffa6d074a33"} Apr 24 19:09:58.591490 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.591439 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" event={"ID":"53275301-bce9-4425-9006-0998dc291f4f","Type":"ContainerStarted","Data":"e1a8f934aef6f5687cadc8e6ff808f92a79ca152c0b473e9e0d3d4681dd25e93"} Apr 24 
19:09:58.592981 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.592953 2573 generic.go:358] "Generic (PLEG): container finished" podID="a0a60b9f-2691-420d-8541-a3d6737868b5" containerID="3ed4aa5355af92f2d28eb7763f145bc5deec9749eac95ca843653005ac570e27" exitCode=0 Apr 24 19:09:58.593091 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.593038 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v2jhm" event={"ID":"a0a60b9f-2691-420d-8541-a3d6737868b5","Type":"ContainerDied","Data":"3ed4aa5355af92f2d28eb7763f145bc5deec9749eac95ca843653005ac570e27"} Apr 24 19:09:58.594456 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.594421 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d8rzc" event={"ID":"429311fb-ef10-40c4-958e-de80bbde38f2","Type":"ContainerStarted","Data":"ce457d327ce216af23a316a89ffb6b7f9e26af96658c4162b46fa5d9f33692e4"} Apr 24 19:09:58.595463 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.595443 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerStarted","Data":"000a61e93a34b681bb3ba8716f2854ba941f8b2856c0a6fa11650b985efe8632"} Apr 24 19:09:58.607838 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.607810 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.607940 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.607843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5c18ab70-91cb-4217-b088-7eadecdfc842-metrics-client-ca\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.607940 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.607887 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.607940 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.607904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-grpc-tls\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.607940 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.607924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh8xw\" (UniqueName: \"kubernetes.io/projected/5c18ab70-91cb-4217-b088-7eadecdfc842-kube-api-access-bh8xw\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.608119 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.607960 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " 
pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.608119 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.608013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-tls\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.608119 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.608085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.608745 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.608721 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c18ab70-91cb-4217-b088-7eadecdfc842-metrics-client-ca\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.610981 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.610954 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.611087 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.611032 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.611278 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.611255 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.611396 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.611265 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-grpc-tls\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.611471 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.611443 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.611576 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.611532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5c18ab70-91cb-4217-b088-7eadecdfc842-secret-thanos-querier-tls\") pod 
\"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.618381 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.618356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh8xw\" (UniqueName: \"kubernetes.io/projected/5c18ab70-91cb-4217-b088-7eadecdfc842-kube-api-access-bh8xw\") pod \"thanos-querier-7fd9f79f69-q7q9c\" (UID: \"5c18ab70-91cb-4217-b088-7eadecdfc842\") " pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.627378 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.627298 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-d8rzc" podStartSLOduration=137.815113008 podStartE2EDuration="2m19.627280269s" podCreationTimestamp="2026-04-24 19:07:39 +0000 UTC" firstStartedPulling="2026-04-24 19:09:56.289810812 +0000 UTC m=+169.759970318" lastFinishedPulling="2026-04-24 19:09:58.101978073 +0000 UTC m=+171.572137579" observedRunningTime="2026-04-24 19:09:58.626374277 +0000 UTC m=+172.096533804" watchObservedRunningTime="2026-04-24 19:09:58.627280269 +0000 UTC m=+172.097439801" Apr 24 19:09:58.783813 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.783764 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:09:58.931907 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:58.931873 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c"] Apr 24 19:09:59.127846 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:09:59.127766 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c18ab70_91cb_4217_b088_7eadecdfc842.slice/crio-00559d90bba89f0187a1714d392c6968ee4d6a7721b370300c004c19a16e9c34 WatchSource:0}: Error finding container 00559d90bba89f0187a1714d392c6968ee4d6a7721b370300c004c19a16e9c34: Status 404 returned error can't find the container with id 00559d90bba89f0187a1714d392c6968ee4d6a7721b370300c004c19a16e9c34 Apr 24 19:09:59.601304 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:59.601264 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" event={"ID":"53275301-bce9-4425-9006-0998dc291f4f","Type":"ContainerStarted","Data":"f559812991b807af0a4a09a40796cd4bc3863efb324010d1f384bdf2e9747a91"} Apr 24 19:09:59.603708 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:59.603675 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v2jhm" event={"ID":"a0a60b9f-2691-420d-8541-a3d6737868b5","Type":"ContainerStarted","Data":"479422139cc978272c027cf8bebc6d11d94c2411ca1ee1653739b09fcde3801f"} Apr 24 19:09:59.603830 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:59.603713 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v2jhm" event={"ID":"a0a60b9f-2691-420d-8541-a3d6737868b5","Type":"ContainerStarted","Data":"8ced83f0aa1fd594bc08adbadaa41a633d51a4ba90ea339cb8b120eedad5507b"} Apr 24 19:09:59.605031 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:59.604998 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" event={"ID":"5c18ab70-91cb-4217-b088-7eadecdfc842","Type":"ContainerStarted","Data":"00559d90bba89f0187a1714d392c6968ee4d6a7721b370300c004c19a16e9c34"} Apr 24 19:09:59.606380 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:59.606354 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0adb170-f181-4624-b689-5afa3a246a87" containerID="bcf64d3ecdc3c0cd3fe5af9973012a06779f851fc4a564504eff5cdeb19b63a5" exitCode=0 Apr 24 19:09:59.606481 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:59.606417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerDied","Data":"bcf64d3ecdc3c0cd3fe5af9973012a06779f851fc4a564504eff5cdeb19b63a5"} Apr 24 19:09:59.633752 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:59.633700 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n4gvt" podStartSLOduration=2.44025571 podStartE2EDuration="3.633682845s" podCreationTimestamp="2026-04-24 19:09:56 +0000 UTC" firstStartedPulling="2026-04-24 19:09:58.229334032 +0000 UTC m=+171.699493542" lastFinishedPulling="2026-04-24 19:09:59.422761168 +0000 UTC m=+172.892920677" observedRunningTime="2026-04-24 19:09:59.63348049 +0000 UTC m=+173.103640039" watchObservedRunningTime="2026-04-24 19:09:59.633682845 +0000 UTC m=+173.103842373" Apr 24 19:09:59.671091 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:09:59.670975 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v2jhm" podStartSLOduration=2.359502595 podStartE2EDuration="3.670956095s" podCreationTimestamp="2026-04-24 19:09:56 +0000 UTC" firstStartedPulling="2026-04-24 19:09:56.785545646 +0000 UTC m=+170.255705151" lastFinishedPulling="2026-04-24 19:09:58.09699914 +0000 UTC m=+171.567158651" observedRunningTime="2026-04-24 
19:09:59.669390862 +0000 UTC m=+173.139550403" watchObservedRunningTime="2026-04-24 19:09:59.670956095 +0000 UTC m=+173.141115623" Apr 24 19:10:00.567511 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.567479 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fgzls" Apr 24 19:10:00.862952 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.862857 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-f75f4c864-nlvk8"] Apr 24 19:10:00.866298 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.866266 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:00.870661 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.870633 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 19:10:00.870827 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.870633 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 19:10:00.871475 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.871446 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-7qngg\"" Apr 24 19:10:00.871591 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.871506 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 19:10:00.871591 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.871565 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9kl544f8ff0fh\"" Apr 24 19:10:00.871591 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.871513 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 19:10:00.881141 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.881107 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f75f4c864-nlvk8"] Apr 24 19:10:00.931657 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.931584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/14a50468-3433-4a11-a3a5-017511ea1ead-metrics-server-audit-profiles\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:00.931827 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.931744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14a50468-3433-4a11-a3a5-017511ea1ead-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:00.931965 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.931831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/14a50468-3433-4a11-a3a5-017511ea1ead-audit-log\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:00.931965 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.931904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/14a50468-3433-4a11-a3a5-017511ea1ead-secret-metrics-server-tls\") pod 
\"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:00.931965 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.931941 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a50468-3433-4a11-a3a5-017511ea1ead-client-ca-bundle\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:00.932115 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.931977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/14a50468-3433-4a11-a3a5-017511ea1ead-secret-metrics-server-client-certs\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:00.932115 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:00.932022 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lztgp\" (UniqueName: \"kubernetes.io/projected/14a50468-3433-4a11-a3a5-017511ea1ead-kube-api-access-lztgp\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.032725 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.032679 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/14a50468-3433-4a11-a3a5-017511ea1ead-metrics-server-audit-profiles\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.032923 
ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.032771 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14a50468-3433-4a11-a3a5-017511ea1ead-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.032923 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.032801 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/14a50468-3433-4a11-a3a5-017511ea1ead-audit-log\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.032923 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.032841 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/14a50468-3433-4a11-a3a5-017511ea1ead-secret-metrics-server-tls\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.032923 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.032888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a50468-3433-4a11-a3a5-017511ea1ead-client-ca-bundle\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.032923 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.032919 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/14a50468-3433-4a11-a3a5-017511ea1ead-secret-metrics-server-client-certs\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.033136 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.032948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lztgp\" (UniqueName: \"kubernetes.io/projected/14a50468-3433-4a11-a3a5-017511ea1ead-kube-api-access-lztgp\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.033466 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.033423 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/14a50468-3433-4a11-a3a5-017511ea1ead-audit-log\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.033827 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.033801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/14a50468-3433-4a11-a3a5-017511ea1ead-metrics-server-audit-profiles\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.034207 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.034180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14a50468-3433-4a11-a3a5-017511ea1ead-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 
19:10:01.035995 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.035973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/14a50468-3433-4a11-a3a5-017511ea1ead-secret-metrics-server-tls\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.036101 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.036076 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/14a50468-3433-4a11-a3a5-017511ea1ead-secret-metrics-server-client-certs\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.036729 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.036696 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a50468-3433-4a11-a3a5-017511ea1ead-client-ca-bundle\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.037715 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.037694 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5"] Apr 24 19:10:01.042229 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.042205 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" Apr 24 19:10:01.044689 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.044595 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 19:10:01.047783 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.047754 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-m4dr4\"" Apr 24 19:10:01.049645 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.049597 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lztgp\" (UniqueName: \"kubernetes.io/projected/14a50468-3433-4a11-a3a5-017511ea1ead-kube-api-access-lztgp\") pod \"metrics-server-f75f4c864-nlvk8\" (UID: \"14a50468-3433-4a11-a3a5-017511ea1ead\") " pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.055339 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.055303 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5"] Apr 24 19:10:01.133731 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.133643 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a82aef2a-3335-40f6-8483-d1fd1479a47a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fghl5\" (UID: \"a82aef2a-3335-40f6-8483-d1fd1479a47a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" Apr 24 19:10:01.178642 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.178581 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" Apr 24 19:10:01.234904 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.234863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a82aef2a-3335-40f6-8483-d1fd1479a47a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fghl5\" (UID: \"a82aef2a-3335-40f6-8483-d1fd1479a47a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" Apr 24 19:10:01.237667 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.237629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a82aef2a-3335-40f6-8483-d1fd1479a47a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fghl5\" (UID: \"a82aef2a-3335-40f6-8483-d1fd1479a47a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" Apr 24 19:10:01.367066 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.367036 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" Apr 24 19:10:01.510375 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.510337 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f75f4c864-nlvk8"] Apr 24 19:10:01.515572 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:10:01.515529 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a50468_3433_4a11_a3a5_017511ea1ead.slice/crio-c38d3fbf7363e4979bf947f88a36bed8052d6a65f8afe6c609da18996aa3a978 WatchSource:0}: Error finding container c38d3fbf7363e4979bf947f88a36bed8052d6a65f8afe6c609da18996aa3a978: Status 404 returned error can't find the container with id c38d3fbf7363e4979bf947f88a36bed8052d6a65f8afe6c609da18996aa3a978 Apr 24 19:10:01.539864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.539794 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5"] Apr 24 19:10:01.543292 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:10:01.543248 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda82aef2a_3335_40f6_8483_d1fd1479a47a.slice/crio-2682448d01191bd5807623a026c3a964b24eb50a556b518763e99e8bb7e624e5 WatchSource:0}: Error finding container 2682448d01191bd5807623a026c3a964b24eb50a556b518763e99e8bb7e624e5: Status 404 returned error can't find the container with id 2682448d01191bd5807623a026c3a964b24eb50a556b518763e99e8bb7e624e5 Apr 24 19:10:01.618862 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.618751 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" event={"ID":"5c18ab70-91cb-4217-b088-7eadecdfc842","Type":"ContainerStarted","Data":"693972dacdca9c202f80d7c197cb75f08c2b88d40d969c0565d6464deda11565"} Apr 24 19:10:01.618862 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:10:01.618791 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" event={"ID":"5c18ab70-91cb-4217-b088-7eadecdfc842","Type":"ContainerStarted","Data":"9b506674ea2ff31bc8117ce80b751d45303aca376e9d0271b1f6236a32108323"} Apr 24 19:10:01.618862 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.618800 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" event={"ID":"5c18ab70-91cb-4217-b088-7eadecdfc842","Type":"ContainerStarted","Data":"33fcd0094a87ba4f712e3ee8e5d6d1f607f555b8f2cae12d19de02d86b803cde"} Apr 24 19:10:01.621662 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.621628 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerStarted","Data":"6dac3a24ffc401d6ec34d7b854119caa834f6e94047689fda06335d86972c5d5"} Apr 24 19:10:01.621787 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.621672 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerStarted","Data":"3558f8c5968722c483b292729fff2c143ca1277c7140e403c41795d75ba498ea"} Apr 24 19:10:01.621787 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.621688 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerStarted","Data":"9978eeacad4b874f3d187fb551afc47347a5160fbbb5395eadc82086f0a30588"} Apr 24 19:10:01.623711 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.623665 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" 
event={"ID":"a82aef2a-3335-40f6-8483-d1fd1479a47a","Type":"ContainerStarted","Data":"2682448d01191bd5807623a026c3a964b24eb50a556b518763e99e8bb7e624e5"} Apr 24 19:10:01.624856 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:01.624817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" event={"ID":"14a50468-3433-4a11-a3a5-017511ea1ead","Type":"ContainerStarted","Data":"c38d3fbf7363e4979bf947f88a36bed8052d6a65f8afe6c609da18996aa3a978"} Apr 24 19:10:02.633793 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:02.633742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerStarted","Data":"94f3d0e6bf5efddc90e4fa5ec12e590eb2abb89a1693643e2c72d6902d4a2ed5"} Apr 24 19:10:02.633793 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:02.633795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerStarted","Data":"72bd3404501c7332f367dd3958d4fca375a46b121b092a2c9c6abf58d1a2b1aa"} Apr 24 19:10:03.640306 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.640260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" event={"ID":"5c18ab70-91cb-4217-b088-7eadecdfc842","Type":"ContainerStarted","Data":"2a8e5004f5d6675fec7a6d6f8284eaba8f49cda7ba718c3a10e1a170bbada4ed"} Apr 24 19:10:03.640306 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.640310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" event={"ID":"5c18ab70-91cb-4217-b088-7eadecdfc842","Type":"ContainerStarted","Data":"767bfb879d6f89d0fb4a30853be6f1dca4f9fae162df65355acd5f97aa36d81f"} Apr 24 19:10:03.640859 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.640325 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" event={"ID":"5c18ab70-91cb-4217-b088-7eadecdfc842","Type":"ContainerStarted","Data":"1208d15e398fcd69018ead4d96d3732ffbbfcf0399acd741806a10b25c240853"} Apr 24 19:10:03.640859 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.640458 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:10:03.644056 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.644028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerStarted","Data":"6c3db2f31169f389829359cc6a01ba1b23fa0fe8833ccd17b0f06908034aabe1"} Apr 24 19:10:03.645528 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.645500 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" event={"ID":"a82aef2a-3335-40f6-8483-d1fd1479a47a","Type":"ContainerStarted","Data":"4e89e17a7a0ffc4152b9208829b65c728bda31a05a48ffc8d51e4dab3f213fcb"} Apr 24 19:10:03.645727 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.645713 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" Apr 24 19:10:03.647129 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.647105 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" event={"ID":"14a50468-3433-4a11-a3a5-017511ea1ead","Type":"ContainerStarted","Data":"97203435e1b6b704aac9ac8baf3e8ad716c700ced5ad191d21add81f75f08bca"} Apr 24 19:10:03.651449 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.651430 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" Apr 24 19:10:03.668144 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.668082 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" podStartSLOduration=1.445999608 podStartE2EDuration="5.668061371s" podCreationTimestamp="2026-04-24 19:09:58 +0000 UTC" firstStartedPulling="2026-04-24 19:09:59.129773232 +0000 UTC m=+172.599932741" lastFinishedPulling="2026-04-24 19:10:03.351834983 +0000 UTC m=+176.821994504" observedRunningTime="2026-04-24 19:10:03.665846321 +0000 UTC m=+177.136005850" watchObservedRunningTime="2026-04-24 19:10:03.668061371 +0000 UTC m=+177.138220900" Apr 24 19:10:03.698458 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.698404 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.841812856 podStartE2EDuration="6.698385644s" podCreationTimestamp="2026-04-24 19:09:57 +0000 UTC" firstStartedPulling="2026-04-24 19:09:58.493212489 +0000 UTC m=+171.963371999" lastFinishedPulling="2026-04-24 19:10:03.349785271 +0000 UTC m=+176.819944787" observedRunningTime="2026-04-24 19:10:03.695669667 +0000 UTC m=+177.165829197" watchObservedRunningTime="2026-04-24 19:10:03.698385644 +0000 UTC m=+177.168545173" Apr 24 19:10:03.723760 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:03.723699 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8" podStartSLOduration=1.894897692 podStartE2EDuration="3.723683948s" podCreationTimestamp="2026-04-24 19:10:00 +0000 UTC" firstStartedPulling="2026-04-24 19:10:01.520808857 +0000 UTC m=+174.990968364" lastFinishedPulling="2026-04-24 19:10:03.34959511 +0000 UTC m=+176.819754620" observedRunningTime="2026-04-24 19:10:03.722407575 +0000 UTC m=+177.192567103" watchObservedRunningTime="2026-04-24 19:10:03.723683948 +0000 UTC m=+177.193843478" Apr 24 19:10:09.656898 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:09.656864 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7fd9f79f69-q7q9c" Apr 24 19:10:09.687413 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:09.687363 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fghl5" podStartSLOduration=6.884961174 podStartE2EDuration="8.687350613s" podCreationTimestamp="2026-04-24 19:10:01 +0000 UTC" firstStartedPulling="2026-04-24 19:10:01.547175304 +0000 UTC m=+175.017334812" lastFinishedPulling="2026-04-24 19:10:03.349564717 +0000 UTC m=+176.819724251" observedRunningTime="2026-04-24 19:10:03.748543181 +0000 UTC m=+177.218702699" watchObservedRunningTime="2026-04-24 19:10:09.687350613 +0000 UTC m=+183.157510140" Apr 24 19:10:10.116156 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.116125 2573 scope.go:117] "RemoveContainer" containerID="317ebe42e0b421445b143661669fc29c7b86d2766e9aee549e82503ac5ec0642" Apr 24 19:10:10.668627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.668595 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:10:10.668977 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.668671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" event={"ID":"48af1cfa-4ff5-4543-8939-d43ac71b40ad","Type":"ContainerStarted","Data":"6239ec35ab37fc3089ccf1d9bec40c3eaed1991c7788cd0816725d4b958356c4"} Apr 24 19:10:10.668977 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.668937 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" Apr 24 19:10:10.673648 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.673626 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-545tt" Apr 24 19:10:10.695265 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.695218 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-545tt" podStartSLOduration=55.003562641 podStartE2EDuration="56.695203205s" podCreationTimestamp="2026-04-24 19:09:14 +0000 UTC" firstStartedPulling="2026-04-24 19:09:15.236740734 +0000 UTC m=+128.706900243" lastFinishedPulling="2026-04-24 19:09:16.928381289 +0000 UTC m=+130.398540807" observedRunningTime="2026-04-24 19:10:10.694382015 +0000 UTC m=+184.164541543" watchObservedRunningTime="2026-04-24 19:10:10.695203205 +0000 UTC m=+184.165362733" Apr 24 19:10:10.973384 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.973349 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-s5jkh"] Apr 24 19:10:10.976904 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.976882 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s5jkh" Apr 24 19:10:10.998995 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.998971 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 19:10:10.999152 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.999135 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 19:10:10.999246 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:10.999228 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-hsb9h\"" Apr 24 19:10:11.006521 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:11.006491 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s5jkh"] Apr 24 19:10:11.030298 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:11.030273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w66b9\" (UniqueName: \"kubernetes.io/projected/5ecc46dd-f655-4f16-9e14-5494a657924a-kube-api-access-w66b9\") pod \"downloads-6bcc868b7-s5jkh\" (UID: \"5ecc46dd-f655-4f16-9e14-5494a657924a\") " pod="openshift-console/downloads-6bcc868b7-s5jkh" Apr 24 19:10:11.131680 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:11.131644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w66b9\" (UniqueName: \"kubernetes.io/projected/5ecc46dd-f655-4f16-9e14-5494a657924a-kube-api-access-w66b9\") pod \"downloads-6bcc868b7-s5jkh\" (UID: \"5ecc46dd-f655-4f16-9e14-5494a657924a\") " pod="openshift-console/downloads-6bcc868b7-s5jkh" Apr 24 19:10:11.141355 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:11.141329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w66b9\" (UniqueName: 
\"kubernetes.io/projected/5ecc46dd-f655-4f16-9e14-5494a657924a-kube-api-access-w66b9\") pod \"downloads-6bcc868b7-s5jkh\" (UID: \"5ecc46dd-f655-4f16-9e14-5494a657924a\") " pod="openshift-console/downloads-6bcc868b7-s5jkh" Apr 24 19:10:11.285984 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:11.285901 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s5jkh" Apr 24 19:10:11.423577 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:11.423556 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s5jkh"] Apr 24 19:10:11.426062 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:10:11.426024 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ecc46dd_f655_4f16_9e14_5494a657924a.slice/crio-95fa60a83bd147e624ff1985a071222bcf5580e82eda9703aab9363659d8369d WatchSource:0}: Error finding container 95fa60a83bd147e624ff1985a071222bcf5580e82eda9703aab9363659d8369d: Status 404 returned error can't find the container with id 95fa60a83bd147e624ff1985a071222bcf5580e82eda9703aab9363659d8369d Apr 24 19:10:11.672815 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:11.672741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s5jkh" event={"ID":"5ecc46dd-f655-4f16-9e14-5494a657924a","Type":"ContainerStarted","Data":"95fa60a83bd147e624ff1985a071222bcf5580e82eda9703aab9363659d8369d"} Apr 24 19:10:16.388645 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.388595 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5797597577-tqcq5"] Apr 24 19:10:16.392203 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.392179 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.394801 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.394781 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vw6zx\"" Apr 24 19:10:16.394912 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.394806 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 19:10:16.394912 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.394893 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 19:10:16.395730 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.395659 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 19:10:16.395820 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.395768 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 19:10:16.395820 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.395775 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 19:10:16.405330 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.405300 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5797597577-tqcq5"] Apr 24 19:10:16.479825 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.479790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtlkt\" (UniqueName: \"kubernetes.io/projected/4fd7b7e0-171a-453f-bf0c-97984389dc91-kube-api-access-mtlkt\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.479976 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:10:16.479837 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-oauth-serving-cert\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.479976 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.479875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-serving-cert\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.479976 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.479945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-oauth-config\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.480120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.480006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-config\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.480120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.480050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-service-ca\") pod 
\"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.580756 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.580722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-oauth-serving-cert\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.580945 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.580775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-serving-cert\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.580945 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.580813 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-oauth-config\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.580945 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.580856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-config\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.580945 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.580889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-service-ca\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.581159 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.580966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlkt\" (UniqueName: \"kubernetes.io/projected/4fd7b7e0-171a-453f-bf0c-97984389dc91-kube-api-access-mtlkt\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.581586 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.581515 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-oauth-serving-cert\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.581779 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.581741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-config\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.581923 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.581896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-service-ca\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5" Apr 24 19:10:16.583771 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.583751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-serving-cert\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:10:16.583858 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.583820 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-oauth-config\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:10:16.589482 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.589457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtlkt\" (UniqueName: \"kubernetes.io/projected/4fd7b7e0-171a-453f-bf0c-97984389dc91-kube-api-access-mtlkt\") pod \"console-5797597577-tqcq5\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") " pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:10:16.703685 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.703513 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:10:16.840166 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:16.840133 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5797597577-tqcq5"]
Apr 24 19:10:16.844245 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:10:16.844218 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fd7b7e0_171a_453f_bf0c_97984389dc91.slice/crio-4acdedb97584c0733b4ed31466dd6d29a858726693e740f5c6bce19bbe8004ae WatchSource:0}: Error finding container 4acdedb97584c0733b4ed31466dd6d29a858726693e740f5c6bce19bbe8004ae: Status 404 returned error can't find the container with id 4acdedb97584c0733b4ed31466dd6d29a858726693e740f5c6bce19bbe8004ae
Apr 24 19:10:17.695523 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:17.695482 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5797597577-tqcq5" event={"ID":"4fd7b7e0-171a-453f-bf0c-97984389dc91","Type":"ContainerStarted","Data":"4acdedb97584c0733b4ed31466dd6d29a858726693e740f5c6bce19bbe8004ae"}
Apr 24 19:10:20.707383 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:20.707347 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5797597577-tqcq5" event={"ID":"4fd7b7e0-171a-453f-bf0c-97984389dc91","Type":"ContainerStarted","Data":"dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746"}
Apr 24 19:10:20.728674 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:20.728621 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5797597577-tqcq5" podStartSLOduration=1.7252638120000001 podStartE2EDuration="4.728589277s" podCreationTimestamp="2026-04-24 19:10:16 +0000 UTC" firstStartedPulling="2026-04-24 19:10:16.846398401 +0000 UTC m=+190.316557906" lastFinishedPulling="2026-04-24 19:10:19.849723855 +0000 UTC m=+193.319883371" observedRunningTime="2026-04-24 19:10:20.728046595 +0000 UTC m=+194.198206122" watchObservedRunningTime="2026-04-24 19:10:20.728589277 +0000 UTC m=+194.198748805"
Apr 24 19:10:21.179475 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:21.179434 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8"
Apr 24 19:10:21.179745 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:21.179484 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8"
Apr 24 19:10:26.704506 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.704461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:10:26.705017 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.704580 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:10:26.710307 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.710276 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:10:26.732220 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.732176 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:10:26.798693 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.798655 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85c89f5d9d-jbs9q"]
Apr 24 19:10:26.801624 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.801575 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.810088 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.810058 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 19:10:26.815475 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.815444 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85c89f5d9d-jbs9q"]
Apr 24 19:10:26.877174 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.877141 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-service-ca\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.877374 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.877249 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-oauth-serving-cert\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.877374 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.877346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-serving-cert\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.877509 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.877400 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-config\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.877509 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.877447 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcv2n\" (UniqueName: \"kubernetes.io/projected/e37b41e3-bd25-4312-8c8a-e590bba67bd6-kube-api-access-zcv2n\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.877633 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.877522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-oauth-config\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.877633 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.877538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-trusted-ca-bundle\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.978571 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.978466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-service-ca\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.978571 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.978535 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-oauth-serving-cert\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.978824 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.978582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-serving-cert\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.978824 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.978673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-config\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.978824 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.978726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcv2n\" (UniqueName: \"kubernetes.io/projected/e37b41e3-bd25-4312-8c8a-e590bba67bd6-kube-api-access-zcv2n\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.978824 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.978782 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-oauth-config\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.978824 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.978808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-trusted-ca-bundle\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.979382 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.979359 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-service-ca\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.979476 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.979430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-oauth-serving-cert\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.979817 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.979775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-config\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.980696 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.980670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-trusted-ca-bundle\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.981928 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.981886 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-oauth-config\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.982288 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.982265 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-serving-cert\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:26.987480 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:26.987450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcv2n\" (UniqueName: \"kubernetes.io/projected/e37b41e3-bd25-4312-8c8a-e590bba67bd6-kube-api-access-zcv2n\") pod \"console-85c89f5d9d-jbs9q\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:27.116422 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:27.116374 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:27.248526 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:27.248486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85c89f5d9d-jbs9q"]
Apr 24 19:10:27.252375 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:10:27.252341 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37b41e3_bd25_4312_8c8a_e590bba67bd6.slice/crio-fa885374d116af942fa95a1651807d3a8d4955774ac23b651c00f2526fcaeb00 WatchSource:0}: Error finding container fa885374d116af942fa95a1651807d3a8d4955774ac23b651c00f2526fcaeb00: Status 404 returned error can't find the container with id fa885374d116af942fa95a1651807d3a8d4955774ac23b651c00f2526fcaeb00
Apr 24 19:10:27.732939 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:27.732900 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85c89f5d9d-jbs9q" event={"ID":"e37b41e3-bd25-4312-8c8a-e590bba67bd6","Type":"ContainerStarted","Data":"4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14"}
Apr 24 19:10:27.733404 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:27.733233 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85c89f5d9d-jbs9q" event={"ID":"e37b41e3-bd25-4312-8c8a-e590bba67bd6","Type":"ContainerStarted","Data":"fa885374d116af942fa95a1651807d3a8d4955774ac23b651c00f2526fcaeb00"}
Apr 24 19:10:27.734697 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:27.734665 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s5jkh" event={"ID":"5ecc46dd-f655-4f16-9e14-5494a657924a","Type":"ContainerStarted","Data":"37efc2b39aea45244290e2168dcff6ea8f64183f027ccacd17e282ecf1f0eb29"}
Apr 24 19:10:27.734882 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:27.734863 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-s5jkh"
Apr 24 19:10:27.752242 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:27.752182 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85c89f5d9d-jbs9q" podStartSLOduration=1.752161283 podStartE2EDuration="1.752161283s" podCreationTimestamp="2026-04-24 19:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:10:27.750349926 +0000 UTC m=+201.220509469" watchObservedRunningTime="2026-04-24 19:10:27.752161283 +0000 UTC m=+201.222320813"
Apr 24 19:10:27.756790 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:27.756757 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-s5jkh"
Apr 24 19:10:27.768411 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:27.768352 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-s5jkh" podStartSLOduration=1.999931724 podStartE2EDuration="17.768333193s" podCreationTimestamp="2026-04-24 19:10:10 +0000 UTC" firstStartedPulling="2026-04-24 19:10:11.427920956 +0000 UTC m=+184.898080462" lastFinishedPulling="2026-04-24 19:10:27.196322412 +0000 UTC m=+200.666481931" observedRunningTime="2026-04-24 19:10:27.766574767 +0000 UTC m=+201.236734308" watchObservedRunningTime="2026-04-24 19:10:27.768333193 +0000 UTC m=+201.238492720"
Apr 24 19:10:37.120064 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:37.120036 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:37.120714 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:37.120073 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:37.122776 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:37.122752 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:37.774770 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:37.774736 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85c89f5d9d-jbs9q"
Apr 24 19:10:37.820167 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:37.820125 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5797597577-tqcq5"]
Apr 24 19:10:41.184217 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:41.184187 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8"
Apr 24 19:10:41.188429 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:10:41.188406 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-f75f4c864-nlvk8"
Apr 24 19:11:02.841030 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:02.840967 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5797597577-tqcq5" podUID="4fd7b7e0-171a-453f-bf0c-97984389dc91" containerName="console" containerID="cri-o://dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746" gracePeriod=15
Apr 24 19:11:03.135315 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.135290 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5797597577-tqcq5_4fd7b7e0-171a-453f-bf0c-97984389dc91/console/0.log"
Apr 24 19:11:03.135477 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.135368 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:11:03.216529 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.216495 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-serving-cert\") pod \"4fd7b7e0-171a-453f-bf0c-97984389dc91\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") "
Apr 24 19:11:03.216529 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.216540 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtlkt\" (UniqueName: \"kubernetes.io/projected/4fd7b7e0-171a-453f-bf0c-97984389dc91-kube-api-access-mtlkt\") pod \"4fd7b7e0-171a-453f-bf0c-97984389dc91\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") "
Apr 24 19:11:03.216845 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.216591 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-service-ca\") pod \"4fd7b7e0-171a-453f-bf0c-97984389dc91\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") "
Apr 24 19:11:03.216845 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.216647 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-oauth-serving-cert\") pod \"4fd7b7e0-171a-453f-bf0c-97984389dc91\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") "
Apr 24 19:11:03.216845 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.216715 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-oauth-config\") pod \"4fd7b7e0-171a-453f-bf0c-97984389dc91\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") "
Apr 24 19:11:03.216845 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.216749 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-config\") pod \"4fd7b7e0-171a-453f-bf0c-97984389dc91\" (UID: \"4fd7b7e0-171a-453f-bf0c-97984389dc91\") "
Apr 24 19:11:03.217148 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.217118 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4fd7b7e0-171a-453f-bf0c-97984389dc91" (UID: "4fd7b7e0-171a-453f-bf0c-97984389dc91"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:11:03.217202 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.217157 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-config" (OuterVolumeSpecName: "console-config") pod "4fd7b7e0-171a-453f-bf0c-97984389dc91" (UID: "4fd7b7e0-171a-453f-bf0c-97984389dc91"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:11:03.217572 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.217541 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-service-ca" (OuterVolumeSpecName: "service-ca") pod "4fd7b7e0-171a-453f-bf0c-97984389dc91" (UID: "4fd7b7e0-171a-453f-bf0c-97984389dc91"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:11:03.219107 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.219076 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4fd7b7e0-171a-453f-bf0c-97984389dc91" (UID: "4fd7b7e0-171a-453f-bf0c-97984389dc91"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:11:03.219340 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.219313 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd7b7e0-171a-453f-bf0c-97984389dc91-kube-api-access-mtlkt" (OuterVolumeSpecName: "kube-api-access-mtlkt") pod "4fd7b7e0-171a-453f-bf0c-97984389dc91" (UID: "4fd7b7e0-171a-453f-bf0c-97984389dc91"). InnerVolumeSpecName "kube-api-access-mtlkt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:11:03.219340 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.219313 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4fd7b7e0-171a-453f-bf0c-97984389dc91" (UID: "4fd7b7e0-171a-453f-bf0c-97984389dc91"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:11:03.317555 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.317519 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-oauth-serving-cert\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:11:03.317555 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.317552 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-oauth-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:11:03.317555 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.317562 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:11:03.317802 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.317572 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd7b7e0-171a-453f-bf0c-97984389dc91-console-serving-cert\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:11:03.317802 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.317581 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtlkt\" (UniqueName: \"kubernetes.io/projected/4fd7b7e0-171a-453f-bf0c-97984389dc91-kube-api-access-mtlkt\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:11:03.317802 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.317590 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fd7b7e0-171a-453f-bf0c-97984389dc91-service-ca\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:11:03.861071 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.861045 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5797597577-tqcq5_4fd7b7e0-171a-453f-bf0c-97984389dc91/console/0.log"
Apr 24 19:11:03.861495 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.861084 2573 generic.go:358] "Generic (PLEG): container finished" podID="4fd7b7e0-171a-453f-bf0c-97984389dc91" containerID="dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746" exitCode=2
Apr 24 19:11:03.861495 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.861120 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5797597577-tqcq5" event={"ID":"4fd7b7e0-171a-453f-bf0c-97984389dc91","Type":"ContainerDied","Data":"dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746"}
Apr 24 19:11:03.861495 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.861165 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5797597577-tqcq5"
Apr 24 19:11:03.861495 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.861170 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5797597577-tqcq5" event={"ID":"4fd7b7e0-171a-453f-bf0c-97984389dc91","Type":"ContainerDied","Data":"4acdedb97584c0733b4ed31466dd6d29a858726693e740f5c6bce19bbe8004ae"}
Apr 24 19:11:03.861495 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.861191 2573 scope.go:117] "RemoveContainer" containerID="dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746"
Apr 24 19:11:03.870221 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.870202 2573 scope.go:117] "RemoveContainer" containerID="dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746"
Apr 24 19:11:03.870549 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:11:03.870527 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746\": container with ID starting with dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746 not found: ID does not exist" containerID="dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746"
Apr 24 19:11:03.870601 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.870560 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746"} err="failed to get container status \"dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746\": rpc error: code = NotFound desc = could not find container \"dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746\": container with ID starting with dcee464ed28bbf4aecccfea4e2f7b0e4a86d6f3d9b535938c267e3881b872746 not found: ID does not exist"
Apr 24 19:11:03.883932 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.883887 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5797597577-tqcq5"]
Apr 24 19:11:03.890084 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:03.890048 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5797597577-tqcq5"]
Apr 24 19:11:05.119821 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:05.119779 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd7b7e0-171a-453f-bf0c-97984389dc91" path="/var/lib/kubelet/pods/4fd7b7e0-171a-453f-bf0c-97984389dc91/volumes"
Apr 24 19:11:16.768970 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.768899 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 19:11:16.770146 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.770067 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="alertmanager" containerID="cri-o://9978eeacad4b874f3d187fb551afc47347a5160fbbb5395eadc82086f0a30588" gracePeriod=120
Apr 24 19:11:16.772197 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.770369 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy" containerID="cri-o://72bd3404501c7332f367dd3958d4fca375a46b121b092a2c9c6abf58d1a2b1aa" gracePeriod=120
Apr 24 19:11:16.772197 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.770555 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy-web" containerID="cri-o://6dac3a24ffc401d6ec34d7b854119caa834f6e94047689fda06335d86972c5d5" gracePeriod=120
Apr 24 19:11:16.772197 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.770567 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy-metric" containerID="cri-o://94f3d0e6bf5efddc90e4fa5ec12e590eb2abb89a1693643e2c72d6902d4a2ed5" gracePeriod=120
Apr 24 19:11:16.772197 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.770520 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="prom-label-proxy" containerID="cri-o://6c3db2f31169f389829359cc6a01ba1b23fa0fe8833ccd17b0f06908034aabe1" gracePeriod=120
Apr 24 19:11:16.772197 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.770589 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="config-reloader" containerID="cri-o://3558f8c5968722c483b292729fff2c143ca1277c7140e403c41795d75ba498ea" gracePeriod=120
Apr 24 19:11:16.903930 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.903897 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0adb170-f181-4624-b689-5afa3a246a87" containerID="6c3db2f31169f389829359cc6a01ba1b23fa0fe8833ccd17b0f06908034aabe1" exitCode=0
Apr 24 19:11:16.903930 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.903922 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0adb170-f181-4624-b689-5afa3a246a87" containerID="72bd3404501c7332f367dd3958d4fca375a46b121b092a2c9c6abf58d1a2b1aa" exitCode=0
Apr 24 19:11:16.903930 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.903929 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0adb170-f181-4624-b689-5afa3a246a87" containerID="3558f8c5968722c483b292729fff2c143ca1277c7140e403c41795d75ba498ea" exitCode=0
Apr 24 19:11:16.903930 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.903935 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0adb170-f181-4624-b689-5afa3a246a87" containerID="9978eeacad4b874f3d187fb551afc47347a5160fbbb5395eadc82086f0a30588" exitCode=0
Apr 24 19:11:16.904179 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.903965 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerDied","Data":"6c3db2f31169f389829359cc6a01ba1b23fa0fe8833ccd17b0f06908034aabe1"}
Apr 24 19:11:16.904179 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.904005 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerDied","Data":"72bd3404501c7332f367dd3958d4fca375a46b121b092a2c9c6abf58d1a2b1aa"}
Apr 24 19:11:16.904179 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.904016 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerDied","Data":"3558f8c5968722c483b292729fff2c143ca1277c7140e403c41795d75ba498ea"}
Apr 24 19:11:16.904179 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:16.904026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerDied","Data":"9978eeacad4b874f3d187fb551afc47347a5160fbbb5395eadc82086f0a30588"}
Apr 24 19:11:17.910433 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:17.910399 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0adb170-f181-4624-b689-5afa3a246a87" containerID="94f3d0e6bf5efddc90e4fa5ec12e590eb2abb89a1693643e2c72d6902d4a2ed5" exitCode=0
Apr 24 19:11:17.910433 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:17.910427 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0adb170-f181-4624-b689-5afa3a246a87" containerID="6dac3a24ffc401d6ec34d7b854119caa834f6e94047689fda06335d86972c5d5" exitCode=0
Apr 24 19:11:17.910870 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:17.910469 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerDied","Data":"94f3d0e6bf5efddc90e4fa5ec12e590eb2abb89a1693643e2c72d6902d4a2ed5"}
Apr 24 19:11:17.910870 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:17.910502 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerDied","Data":"6dac3a24ffc401d6ec34d7b854119caa834f6e94047689fda06335d86972c5d5"}
Apr 24 19:11:18.023533 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.023507 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 19:11:18.148504 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148414 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r26t2\" (UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-kube-api-access-r26t2\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") "
Apr 24 19:11:18.148504 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148446 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-cluster-tls-config\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") "
Apr 24 19:11:18.148504 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148482 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-tls-assets\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") "
Apr 24 19:11:18.148504 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148506 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-metric\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") "
Apr 24 19:11:18.148897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148530 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-metrics-client-ca\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") "
Apr 24 19:11:18.148897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148556 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-trusted-ca-bundle\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") "
Apr 24 19:11:18.148897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148582 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-main-db\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") "
Apr 24 19:11:18.148897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148648 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-web\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") "
Apr 24 19:11:18.148897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148675 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-config-volume\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") "
Apr 24 19:11:18.148897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148704 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-main-tls\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: 
\"c0adb170-f181-4624-b689-5afa3a246a87\") " Apr 24 19:11:18.148897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148739 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " Apr 24 19:11:18.148897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148772 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-config-out\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " Apr 24 19:11:18.148897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.148799 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-web-config\") pod \"c0adb170-f181-4624-b689-5afa3a246a87\" (UID: \"c0adb170-f181-4624-b689-5afa3a246a87\") " Apr 24 19:11:18.149316 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.149060 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:11:18.149418 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.149312 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:11:18.149479 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.149436 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:11:18.151881 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.151628 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:18.152032 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.151994 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-kube-api-access-r26t2" (OuterVolumeSpecName: "kube-api-access-r26t2") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). 
InnerVolumeSpecName "kube-api-access-r26t2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:11:18.152113 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.152041 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:18.152113 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.152053 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:11:18.152268 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.152241 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-config-volume" (OuterVolumeSpecName: "config-volume") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:18.152668 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.152639 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:18.153746 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.153724 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:18.154311 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.154280 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-config-out" (OuterVolumeSpecName: "config-out") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:11:18.155810 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.155789 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:18.163867 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.163836 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-web-config" (OuterVolumeSpecName: "web-config") pod "c0adb170-f181-4624-b689-5afa3a246a87" (UID: "c0adb170-f181-4624-b689-5afa3a246a87"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:11:18.249840 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249793 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r26t2\" (UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-kube-api-access-r26t2\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.249840 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249836 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-cluster-tls-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.249840 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249847 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c0adb170-f181-4624-b689-5afa3a246a87-tls-assets\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.249840 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249857 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.250120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249867 2573 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-metrics-client-ca\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.250120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249877 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-trusted-ca-bundle\") on node 
\"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.250120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249886 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-alertmanager-main-db\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.250120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249895 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.250120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249906 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-config-volume\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.250120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249916 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-main-tls\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.250120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249925 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.250120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249933 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c0adb170-f181-4624-b689-5afa3a246a87-config-out\") on node 
\"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.250120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.249942 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c0adb170-f181-4624-b689-5afa3a246a87-web-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:11:18.855156 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.855108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:11:18.857537 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.857512 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a81da49-19b8-407f-a961-d85a0ec045e1-metrics-certs\") pod \"network-metrics-daemon-cr4ls\" (UID: \"8a81da49-19b8-407f-a961-d85a0ec045e1\") " pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:11:18.915816 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.915780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c0adb170-f181-4624-b689-5afa3a246a87","Type":"ContainerDied","Data":"000a61e93a34b681bb3ba8716f2854ba941f8b2856c0a6fa11650b985efe8632"} Apr 24 19:11:18.916294 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.915840 2573 scope.go:117] "RemoveContainer" containerID="6c3db2f31169f389829359cc6a01ba1b23fa0fe8833ccd17b0f06908034aabe1" Apr 24 19:11:18.916294 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.915842 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:18.923568 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.923546 2573 scope.go:117] "RemoveContainer" containerID="94f3d0e6bf5efddc90e4fa5ec12e590eb2abb89a1693643e2c72d6902d4a2ed5" Apr 24 19:11:18.933172 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.933147 2573 scope.go:117] "RemoveContainer" containerID="72bd3404501c7332f367dd3958d4fca375a46b121b092a2c9c6abf58d1a2b1aa" Apr 24 19:11:18.940732 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.940712 2573 scope.go:117] "RemoveContainer" containerID="6dac3a24ffc401d6ec34d7b854119caa834f6e94047689fda06335d86972c5d5" Apr 24 19:11:18.940815 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.940757 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:11:18.945735 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.945709 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:11:18.948999 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.948979 2573 scope.go:117] "RemoveContainer" containerID="3558f8c5968722c483b292729fff2c143ca1277c7140e403c41795d75ba498ea" Apr 24 19:11:18.956418 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.956397 2573 scope.go:117] "RemoveContainer" containerID="9978eeacad4b874f3d187fb551afc47347a5160fbbb5395eadc82086f0a30588" Apr 24 19:11:18.964169 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.964147 2573 scope.go:117] "RemoveContainer" containerID="bcf64d3ecdc3c0cd3fe5af9973012a06779f851fc4a564504eff5cdeb19b63a5" Apr 24 19:11:18.970109 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970081 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:11:18.970475 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970459 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="alertmanager" Apr 24 19:11:18.970475 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970477 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="alertmanager" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970492 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fd7b7e0-171a-453f-bf0c-97984389dc91" containerName="console" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970500 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd7b7e0-171a-453f-bf0c-97984389dc91" containerName="console" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970517 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="config-reloader" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970526 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="config-reloader" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970539 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy-web" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970547 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy-web" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970556 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970564 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970575 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="prom-label-proxy" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970583 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="prom-label-proxy" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970600 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="init-config-reloader" Apr 24 19:11:18.970627 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970625 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="init-config-reloader" Apr 24 19:11:18.971181 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970637 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy-metric" Apr 24 19:11:18.971181 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970645 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy-metric" Apr 24 19:11:18.971181 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970733 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy-web" Apr 24 19:11:18.971181 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970747 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy" Apr 24 19:11:18.971181 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970759 2573 
memory_manager.go:356] "RemoveStaleState removing state" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="alertmanager" Apr 24 19:11:18.971181 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970771 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="config-reloader" Apr 24 19:11:18.971181 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970780 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fd7b7e0-171a-453f-bf0c-97984389dc91" containerName="console" Apr 24 19:11:18.971181 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970792 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="prom-label-proxy" Apr 24 19:11:18.971181 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.970802 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0adb170-f181-4624-b689-5afa3a246a87" containerName="kube-rbac-proxy-metric" Apr 24 19:11:18.976136 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.976114 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:18.978625 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.978587 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 19:11:18.978625 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.978598 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 19:11:18.978791 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.978649 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 19:11:18.978791 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.978683 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 19:11:18.978959 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.978944 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 19:11:18.979005 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.978989 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 19:11:18.979084 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.979060 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-w75gw\"" Apr 24 19:11:18.979503 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.979484 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 19:11:18.979584 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.979548 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 19:11:18.984132 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.984100 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 19:11:18.986786 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:18.986765 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:11:19.118787 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.118703 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h6pgv\"" Apr 24 19:11:19.119227 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.119206 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0adb170-f181-4624-b689-5afa3a246a87" path="/var/lib/kubelet/pods/c0adb170-f181-4624-b689-5afa3a246a87/volumes" Apr 24 19:11:19.126849 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.126820 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cr4ls" Apr 24 19:11:19.158037 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158000 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c30c4e19-0034-4eca-8468-efb2b6b708e4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158209 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158043 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c4e19-0034-4eca-8468-efb2b6b708e4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158209 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158093 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c30c4e19-0034-4eca-8468-efb2b6b708e4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158209 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158137 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158209 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158175 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158341 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158341 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158237 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158341 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158271 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-config-volume\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158341 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158328 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-web-config\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158469 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158357 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158469 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158395 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c30c4e19-0034-4eca-8468-efb2b6b708e4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158469 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c30c4e19-0034-4eca-8468-efb2b6b708e4-config-out\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.158469 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.158435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k4bg\" (UniqueName: \"kubernetes.io/projected/c30c4e19-0034-4eca-8468-efb2b6b708e4-kube-api-access-5k4bg\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.257584 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.257555 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cr4ls"] Apr 24 19:11:19.259255 
ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c30c4e19-0034-4eca-8468-efb2b6b708e4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.259336 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c30c4e19-0034-4eca-8468-efb2b6b708e4-config-out\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.259336 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k4bg\" (UniqueName: \"kubernetes.io/projected/c30c4e19-0034-4eca-8468-efb2b6b708e4-kube-api-access-5k4bg\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.259878 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c30c4e19-0034-4eca-8468-efb2b6b708e4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.259878 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c4e19-0034-4eca-8468-efb2b6b708e4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.259878 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c30c4e19-0034-4eca-8468-efb2b6b708e4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.259878 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.259878 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.259878 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c30c4e19-0034-4eca-8468-efb2b6b708e4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.259878 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.259805 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.260287 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.260143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c30c4e19-0034-4eca-8468-efb2b6b708e4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.260287 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.260147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.260287 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.260211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-config-volume\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.260287 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.260250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-web-config\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.260474 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.260290 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.261149 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.261121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c4e19-0034-4eca-8468-efb2b6b708e4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.262268 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.262128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c30c4e19-0034-4eca-8468-efb2b6b708e4-config-out\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.263273 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.262750 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c30c4e19-0034-4eca-8468-efb2b6b708e4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.263273 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.263217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.264158 ip-10-0-131-214 kubenswrapper[2573]: I0424 
19:11:19.264124 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-web-config\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.264820 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.264593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.264820 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.264676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-config-volume\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.264820 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.264709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.265120 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.265098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.265557 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:11:19.265539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c30c4e19-0034-4eca-8468-efb2b6b708e4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.268166 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.268142 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k4bg\" (UniqueName: \"kubernetes.io/projected/c30c4e19-0034-4eca-8468-efb2b6b708e4-kube-api-access-5k4bg\") pod \"alertmanager-main-0\" (UID: \"c30c4e19-0034-4eca-8468-efb2b6b708e4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.287662 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.287625 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 19:11:19.422282 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.422196 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 19:11:19.425096 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:11:19.425063 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30c4e19_0034_4eca_8468_efb2b6b708e4.slice/crio-3b6f8be14c50f7cef6c72eb93e500e82bf4142512b72412a16d206451a53dc64 WatchSource:0}: Error finding container 3b6f8be14c50f7cef6c72eb93e500e82bf4142512b72412a16d206451a53dc64: Status 404 returned error can't find the container with id 3b6f8be14c50f7cef6c72eb93e500e82bf4142512b72412a16d206451a53dc64 Apr 24 19:11:19.922170 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.922132 2573 generic.go:358] "Generic (PLEG): container finished" podID="c30c4e19-0034-4eca-8468-efb2b6b708e4" 
containerID="8a24b72cc4235c35ab4daffee998834ac84398f1aad309d332c4e71f976c132f" exitCode=0 Apr 24 19:11:19.922628 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.922224 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c4e19-0034-4eca-8468-efb2b6b708e4","Type":"ContainerDied","Data":"8a24b72cc4235c35ab4daffee998834ac84398f1aad309d332c4e71f976c132f"} Apr 24 19:11:19.922628 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.922270 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c4e19-0034-4eca-8468-efb2b6b708e4","Type":"ContainerStarted","Data":"3b6f8be14c50f7cef6c72eb93e500e82bf4142512b72412a16d206451a53dc64"} Apr 24 19:11:19.924626 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:19.924572 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cr4ls" event={"ID":"8a81da49-19b8-407f-a961-d85a0ec045e1","Type":"ContainerStarted","Data":"a594413b2f4256e247735a3719164994fddcad2e4da7ba344515d16d544d5bce"} Apr 24 19:11:20.825200 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.825165 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-56bb55b79d-kdvgh"] Apr 24 19:11:20.828792 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.828769 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.832670 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.832558 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 19:11:20.832670 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.832654 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 19:11:20.832948 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.832743 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 19:11:20.832948 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.832905 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-gwfmk\"" Apr 24 19:11:20.832948 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.832931 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 19:11:20.832948 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.832945 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 19:11:20.840456 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.840427 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 19:11:20.853688 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.853659 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56bb55b79d-kdvgh"] Apr 24 19:11:20.876182 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.876143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a2f786-c145-4f81-9f1e-39b38e46a058-serving-certs-ca-bundle\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.876182 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.876185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-telemeter-client-tls\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.876400 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.876242 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.876400 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.876259 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25a2f786-c145-4f81-9f1e-39b38e46a058-metrics-client-ca\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.876400 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.876336 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-federate-client-tls\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.876400 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.876362 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a2f786-c145-4f81-9f1e-39b38e46a058-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.876519 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.876404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnrqr\" (UniqueName: \"kubernetes.io/projected/25a2f786-c145-4f81-9f1e-39b38e46a058-kube-api-access-xnrqr\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.876519 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.876433 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-secret-telemeter-client\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.930960 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.930918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c4e19-0034-4eca-8468-efb2b6b708e4","Type":"ContainerStarted","Data":"5e2f6d689c27dbc370fbcbcea0fa3ffb278eb98d9c881ca73bda9ac929d0969f"} Apr 24 19:11:20.930960 
ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.930957 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c4e19-0034-4eca-8468-efb2b6b708e4","Type":"ContainerStarted","Data":"e6113c4c639e95be7612364195995400fcf973d243d0ccda9be3c20b35462b1a"} Apr 24 19:11:20.930960 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.930967 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c4e19-0034-4eca-8468-efb2b6b708e4","Type":"ContainerStarted","Data":"5b09679772bb4ddcfce97cb766aa13e572875567f001fd2b7349248e113b7a6b"} Apr 24 19:11:20.931502 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.930981 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c4e19-0034-4eca-8468-efb2b6b708e4","Type":"ContainerStarted","Data":"f379267c4266efe4f2dbc0f3b4c55c8ee9fd11f5e77efd9708045277a1f919aa"} Apr 24 19:11:20.931502 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.930993 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c4e19-0034-4eca-8468-efb2b6b708e4","Type":"ContainerStarted","Data":"7d3b1c5a757249c1b0b86eb262df72824ca088dbc8ae9e07b85432d110cb12ef"} Apr 24 19:11:20.931502 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.931003 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c4e19-0034-4eca-8468-efb2b6b708e4","Type":"ContainerStarted","Data":"dece0000a7a75668f1e830f590d22d6a40bf88183b6f091c50e8abf87e71a230"} Apr 24 19:11:20.932496 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.932470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cr4ls" event={"ID":"8a81da49-19b8-407f-a961-d85a0ec045e1","Type":"ContainerStarted","Data":"d0354c6f333fa4bec739852cf37f789e02fe259739ad6035d82f09b7aec18be8"} 
Apr 24 19:11:20.932640 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.932505 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cr4ls" event={"ID":"8a81da49-19b8-407f-a961-d85a0ec045e1","Type":"ContainerStarted","Data":"c43f4b581ef97c27e0f995b1afae16e389a22d09894335ea2f1bcb0d79ff1c38"} Apr 24 19:11:20.966548 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.966480 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.966463107 podStartE2EDuration="2.966463107s" podCreationTimestamp="2026-04-24 19:11:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:11:20.964943828 +0000 UTC m=+254.435103370" watchObservedRunningTime="2026-04-24 19:11:20.966463107 +0000 UTC m=+254.436622665" Apr 24 19:11:20.977212 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.977178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnrqr\" (UniqueName: \"kubernetes.io/projected/25a2f786-c145-4f81-9f1e-39b38e46a058-kube-api-access-xnrqr\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.977395 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.977230 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-secret-telemeter-client\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.977395 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.977295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a2f786-c145-4f81-9f1e-39b38e46a058-serving-certs-ca-bundle\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.977395 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.977342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-telemeter-client-tls\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.977562 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.977456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.977562 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.977487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25a2f786-c145-4f81-9f1e-39b38e46a058-metrics-client-ca\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.978421 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.978388 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a2f786-c145-4f81-9f1e-39b38e46a058-serving-certs-ca-bundle\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: 
\"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.978596 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.978580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-federate-client-tls\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.978780 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.978753 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a2f786-c145-4f81-9f1e-39b38e46a058-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.979163 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.979138 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25a2f786-c145-4f81-9f1e-39b38e46a058-metrics-client-ca\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.979521 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.979497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a2f786-c145-4f81-9f1e-39b38e46a058-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.980843 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.980816 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.981301 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.981283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-secret-telemeter-client\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.981391 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.981295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-federate-client-tls\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.981541 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.981519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/25a2f786-c145-4f81-9f1e-39b38e46a058-telemeter-client-tls\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:20.984225 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.984171 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cr4ls" podStartSLOduration=252.873684728 podStartE2EDuration="4m13.984154509s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" 
firstStartedPulling="2026-04-24 19:11:19.264291707 +0000 UTC m=+252.734451215" lastFinishedPulling="2026-04-24 19:11:20.374761488 +0000 UTC m=+253.844920996" observedRunningTime="2026-04-24 19:11:20.982582268 +0000 UTC m=+254.452741797" watchObservedRunningTime="2026-04-24 19:11:20.984154509 +0000 UTC m=+254.454314037" Apr 24 19:11:20.987743 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:20.987707 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnrqr\" (UniqueName: \"kubernetes.io/projected/25a2f786-c145-4f81-9f1e-39b38e46a058-kube-api-access-xnrqr\") pod \"telemeter-client-56bb55b79d-kdvgh\" (UID: \"25a2f786-c145-4f81-9f1e-39b38e46a058\") " pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:21.149172 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:21.149082 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" Apr 24 19:11:21.282371 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:21.282345 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56bb55b79d-kdvgh"] Apr 24 19:11:21.284590 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:11:21.284557 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25a2f786_c145_4f81_9f1e_39b38e46a058.slice/crio-20f6aa84a7862966855630a27792d81aee6c85655ae7bbd76e17d9546e4a5aae WatchSource:0}: Error finding container 20f6aa84a7862966855630a27792d81aee6c85655ae7bbd76e17d9546e4a5aae: Status 404 returned error can't find the container with id 20f6aa84a7862966855630a27792d81aee6c85655ae7bbd76e17d9546e4a5aae Apr 24 19:11:21.938039 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:21.937997 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" 
event={"ID":"25a2f786-c145-4f81-9f1e-39b38e46a058","Type":"ContainerStarted","Data":"20f6aa84a7862966855630a27792d81aee6c85655ae7bbd76e17d9546e4a5aae"} Apr 24 19:11:22.943210 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:22.943181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" event={"ID":"25a2f786-c145-4f81-9f1e-39b38e46a058","Type":"ContainerStarted","Data":"b0438b147d99240c148d21d1544f3cd27fe349ee37e989dacc6014a61c7731aa"} Apr 24 19:11:22.943210 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:22.943216 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" event={"ID":"25a2f786-c145-4f81-9f1e-39b38e46a058","Type":"ContainerStarted","Data":"737773000ddfc2558906c5cde090c757311c9839950c6fd07f21974aece37a98"} Apr 24 19:11:23.947704 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:23.947668 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" event={"ID":"25a2f786-c145-4f81-9f1e-39b38e46a058","Type":"ContainerStarted","Data":"7a65ec3efe4928e500b0ef74ee57581d968f160e84ee915fd00a6c84deda9fcf"} Apr 24 19:11:23.970256 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:23.970195 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-56bb55b79d-kdvgh" podStartSLOduration=2.505900739 podStartE2EDuration="3.970178792s" podCreationTimestamp="2026-04-24 19:11:20 +0000 UTC" firstStartedPulling="2026-04-24 19:11:21.288796424 +0000 UTC m=+254.758955930" lastFinishedPulling="2026-04-24 19:11:22.753074462 +0000 UTC m=+256.223233983" observedRunningTime="2026-04-24 19:11:23.969819866 +0000 UTC m=+257.439979394" watchObservedRunningTime="2026-04-24 19:11:23.970178792 +0000 UTC m=+257.440338317" Apr 24 19:11:24.674920 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.674883 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-f788fccf-x8s6p"] Apr 24 19:11:24.678437 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.678406 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.694375 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.694332 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f788fccf-x8s6p"] Apr 24 19:11:24.709554 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.709517 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbl49\" (UniqueName: \"kubernetes.io/projected/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-kube-api-access-zbl49\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.709762 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.709571 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-config\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.709762 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.709692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-service-ca\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.709762 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.709725 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-serving-cert\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.709891 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.709752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-oauth-config\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.709891 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.709803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-oauth-serving-cert\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.709891 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.709860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-trusted-ca-bundle\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.810906 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.810866 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-service-ca\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.810906 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.810910 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-serving-cert\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.811125 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.810946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-oauth-config\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.811125 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.810971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-oauth-serving-cert\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.811229 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.811201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-trusted-ca-bundle\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.811300 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.811283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbl49\" (UniqueName: \"kubernetes.io/projected/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-kube-api-access-zbl49\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 
19:11:24.811366 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.811339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-config\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.811730 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.811703 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-service-ca\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.811730 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.811717 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-oauth-serving-cert\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.812069 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.812044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-config\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.812137 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.812075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-trusted-ca-bundle\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 
19:11:24.813567 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.813546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-serving-cert\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.813672 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.813582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-oauth-config\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.821209 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.821174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbl49\" (UniqueName: \"kubernetes.io/projected/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-kube-api-access-zbl49\") pod \"console-f788fccf-x8s6p\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") " pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:24.989987 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:24.989948 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:25.118468 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:11:25.118435 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073d2a6e_191c_4ce7_9ee6_c3fbf0b1dcf8.slice/crio-dac3b5e3b789650740ef5b1efa812d3616112bbda73c68d121aa1744bded907c WatchSource:0}: Error finding container dac3b5e3b789650740ef5b1efa812d3616112bbda73c68d121aa1744bded907c: Status 404 returned error can't find the container with id dac3b5e3b789650740ef5b1efa812d3616112bbda73c68d121aa1744bded907c Apr 24 19:11:25.120651 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:25.120625 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f788fccf-x8s6p"] Apr 24 19:11:25.956731 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:25.956689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f788fccf-x8s6p" event={"ID":"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8","Type":"ContainerStarted","Data":"fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f"} Apr 24 19:11:25.956731 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:25.956727 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f788fccf-x8s6p" event={"ID":"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8","Type":"ContainerStarted","Data":"dac3b5e3b789650740ef5b1efa812d3616112bbda73c68d121aa1744bded907c"} Apr 24 19:11:25.975246 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:25.975195 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f788fccf-x8s6p" podStartSLOduration=1.975178536 podStartE2EDuration="1.975178536s" podCreationTimestamp="2026-04-24 19:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:11:25.974351225 +0000 UTC m=+259.444510764" 
watchObservedRunningTime="2026-04-24 19:11:25.975178536 +0000 UTC m=+259.445338065" Apr 24 19:11:34.991080 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:34.991048 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:34.991569 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:34.991191 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:34.995633 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:34.995601 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:35.993594 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:35.993558 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f788fccf-x8s6p" Apr 24 19:11:36.042007 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:11:36.041974 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85c89f5d9d-jbs9q"] Apr 24 19:12:01.065541 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.065487 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85c89f5d9d-jbs9q" podUID="e37b41e3-bd25-4312-8c8a-e590bba67bd6" containerName="console" containerID="cri-o://4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14" gracePeriod=15 Apr 24 19:12:01.311535 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.311513 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85c89f5d9d-jbs9q_e37b41e3-bd25-4312-8c8a-e590bba67bd6/console/0.log" Apr 24 19:12:01.311677 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.311571 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85c89f5d9d-jbs9q" Apr 24 19:12:01.422843 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.422755 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcv2n\" (UniqueName: \"kubernetes.io/projected/e37b41e3-bd25-4312-8c8a-e590bba67bd6-kube-api-access-zcv2n\") pod \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " Apr 24 19:12:01.422843 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.422804 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-oauth-serving-cert\") pod \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " Apr 24 19:12:01.422843 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.422827 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-config\") pod \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " Apr 24 19:12:01.423126 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.422852 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-serving-cert\") pod \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " Apr 24 19:12:01.423126 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.422873 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-service-ca\") pod \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " Apr 24 19:12:01.423126 
ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.422956 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-trusted-ca-bundle\") pod \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " Apr 24 19:12:01.423126 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.423014 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-oauth-config\") pod \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\" (UID: \"e37b41e3-bd25-4312-8c8a-e590bba67bd6\") " Apr 24 19:12:01.423328 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.423299 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-config" (OuterVolumeSpecName: "console-config") pod "e37b41e3-bd25-4312-8c8a-e590bba67bd6" (UID: "e37b41e3-bd25-4312-8c8a-e590bba67bd6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:12:01.423378 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.423333 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e37b41e3-bd25-4312-8c8a-e590bba67bd6" (UID: "e37b41e3-bd25-4312-8c8a-e590bba67bd6"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:12:01.423378 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.423351 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-service-ca" (OuterVolumeSpecName: "service-ca") pod "e37b41e3-bd25-4312-8c8a-e590bba67bd6" (UID: "e37b41e3-bd25-4312-8c8a-e590bba67bd6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:12:01.423456 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.423386 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e37b41e3-bd25-4312-8c8a-e590bba67bd6" (UID: "e37b41e3-bd25-4312-8c8a-e590bba67bd6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:12:01.425036 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.425006 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e37b41e3-bd25-4312-8c8a-e590bba67bd6" (UID: "e37b41e3-bd25-4312-8c8a-e590bba67bd6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:12:01.425036 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.425026 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e37b41e3-bd25-4312-8c8a-e590bba67bd6" (UID: "e37b41e3-bd25-4312-8c8a-e590bba67bd6"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:12:01.425195 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.425058 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37b41e3-bd25-4312-8c8a-e590bba67bd6-kube-api-access-zcv2n" (OuterVolumeSpecName: "kube-api-access-zcv2n") pod "e37b41e3-bd25-4312-8c8a-e590bba67bd6" (UID: "e37b41e3-bd25-4312-8c8a-e590bba67bd6"). InnerVolumeSpecName "kube-api-access-zcv2n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:12:01.523940 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.523905 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-trusted-ca-bundle\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:12:01.523940 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.523933 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-oauth-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:12:01.523940 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.523943 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zcv2n\" (UniqueName: \"kubernetes.io/projected/e37b41e3-bd25-4312-8c8a-e590bba67bd6-kube-api-access-zcv2n\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:12:01.524166 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.523952 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-oauth-serving-cert\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:12:01.524166 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.523963 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:12:01.524166 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.523972 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e37b41e3-bd25-4312-8c8a-e590bba67bd6-console-serving-cert\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:12:01.524166 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:01.523982 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e37b41e3-bd25-4312-8c8a-e590bba67bd6-service-ca\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:12:02.072000 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.071971 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85c89f5d9d-jbs9q_e37b41e3-bd25-4312-8c8a-e590bba67bd6/console/0.log" Apr 24 19:12:02.072455 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.072017 2573 generic.go:358] "Generic (PLEG): container finished" podID="e37b41e3-bd25-4312-8c8a-e590bba67bd6" containerID="4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14" exitCode=2 Apr 24 19:12:02.072455 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.072082 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85c89f5d9d-jbs9q" Apr 24 19:12:02.072455 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.072103 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85c89f5d9d-jbs9q" event={"ID":"e37b41e3-bd25-4312-8c8a-e590bba67bd6","Type":"ContainerDied","Data":"4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14"} Apr 24 19:12:02.072455 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.072145 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85c89f5d9d-jbs9q" event={"ID":"e37b41e3-bd25-4312-8c8a-e590bba67bd6","Type":"ContainerDied","Data":"fa885374d116af942fa95a1651807d3a8d4955774ac23b651c00f2526fcaeb00"} Apr 24 19:12:02.072455 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.072162 2573 scope.go:117] "RemoveContainer" containerID="4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14" Apr 24 19:12:02.080521 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.080501 2573 scope.go:117] "RemoveContainer" containerID="4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14" Apr 24 19:12:02.080796 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:12:02.080778 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14\": container with ID starting with 4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14 not found: ID does not exist" containerID="4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14" Apr 24 19:12:02.080853 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.080806 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14"} err="failed to get container status \"4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14\": rpc error: code = 
NotFound desc = could not find container \"4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14\": container with ID starting with 4e9b6a85842146809888aab51da663efeeb8f1ceec8b0812d3fb7812a152bb14 not found: ID does not exist" Apr 24 19:12:02.093134 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.093111 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85c89f5d9d-jbs9q"] Apr 24 19:12:02.098029 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:02.098008 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85c89f5d9d-jbs9q"] Apr 24 19:12:03.119636 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:03.119568 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37b41e3-bd25-4312-8c8a-e590bba67bd6" path="/var/lib/kubelet/pods/e37b41e3-bd25-4312-8c8a-e590bba67bd6/volumes" Apr 24 19:12:06.993705 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:06.993674 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:12:06.994252 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:06.994229 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:12:07.003552 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:07.003527 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:12:07.004290 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:07.004271 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:12:07.007147 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:07.007134 
2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 19:12:32.933826 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.933787 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-764fd8f9f6-bwnwc"] Apr 24 19:12:32.936248 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.934160 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e37b41e3-bd25-4312-8c8a-e590bba67bd6" containerName="console" Apr 24 19:12:32.936248 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.934180 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37b41e3-bd25-4312-8c8a-e590bba67bd6" containerName="console" Apr 24 19:12:32.936248 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.934295 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e37b41e3-bd25-4312-8c8a-e590bba67bd6" containerName="console" Apr 24 19:12:32.937309 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.937285 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-764fd8f9f6-bwnwc" Apr 24 19:12:32.946007 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.945975 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764fd8f9f6-bwnwc"] Apr 24 19:12:32.977158 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.977128 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhh6s\" (UniqueName: \"kubernetes.io/projected/186b59ca-f095-423a-a635-46c8087c0242-kube-api-access-vhh6s\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc" Apr 24 19:12:32.977447 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.977424 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-service-ca\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc" Apr 24 19:12:32.977569 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.977554 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-oauth-serving-cert\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc" Apr 24 19:12:32.977710 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.977695 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-oauth-config\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc" Apr 24 
19:12:32.977861 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.977846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-console-config\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:32.978043 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.978021 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-serving-cert\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:32.978156 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:32.978140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-trusted-ca-bundle\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.079355 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.079320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-serving-cert\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.079355 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.079357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-trusted-ca-bundle\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.079573 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.079382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhh6s\" (UniqueName: \"kubernetes.io/projected/186b59ca-f095-423a-a635-46c8087c0242-kube-api-access-vhh6s\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.079573 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.079437 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-service-ca\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.079573 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.079455 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-oauth-serving-cert\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.079573 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.079485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-oauth-config\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.079573 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.079549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-console-config\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.080183 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.080157 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-oauth-serving-cert\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.080299 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.080278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-service-ca\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.080342 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.080288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-trusted-ca-bundle\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.080398 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.080375 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-console-config\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.081976 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.081946 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-serving-cert\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.081976 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.081968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-oauth-config\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.087231 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.087207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhh6s\" (UniqueName: \"kubernetes.io/projected/186b59ca-f095-423a-a635-46c8087c0242-kube-api-access-vhh6s\") pod \"console-764fd8f9f6-bwnwc\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.248018 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.247920 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:33.370382 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.370348 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764fd8f9f6-bwnwc"]
Apr 24 19:12:33.376375 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:12:33.376330 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod186b59ca_f095_423a_a635_46c8087c0242.slice/crio-303db310920d139abde9a68add2e2dd547d86e356f05cd14823552859f6d2d18 WatchSource:0}: Error finding container 303db310920d139abde9a68add2e2dd547d86e356f05cd14823552859f6d2d18: Status 404 returned error can't find the container with id 303db310920d139abde9a68add2e2dd547d86e356f05cd14823552859f6d2d18
Apr 24 19:12:33.377879 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:33.377860 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:12:34.171865 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:34.171823 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764fd8f9f6-bwnwc" event={"ID":"186b59ca-f095-423a-a635-46c8087c0242","Type":"ContainerStarted","Data":"e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b"}
Apr 24 19:12:34.171865 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:34.171870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764fd8f9f6-bwnwc" event={"ID":"186b59ca-f095-423a-a635-46c8087c0242","Type":"ContainerStarted","Data":"303db310920d139abde9a68add2e2dd547d86e356f05cd14823552859f6d2d18"}
Apr 24 19:12:34.202216 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:34.202170 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-764fd8f9f6-bwnwc" podStartSLOduration=2.202154382 podStartE2EDuration="2.202154382s" podCreationTimestamp="2026-04-24 19:12:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:12:34.200983856 +0000 UTC m=+327.671143385" watchObservedRunningTime="2026-04-24 19:12:34.202154382 +0000 UTC m=+327.672313909"
Apr 24 19:12:43.248879 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:43.248828 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:43.249320 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:43.248908 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:43.253623 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:43.253586 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:44.204625 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:44.204591 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-764fd8f9f6-bwnwc"
Apr 24 19:12:44.250091 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:12:44.250058 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f788fccf-x8s6p"]
Apr 24 19:13:09.270442 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.270331 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f788fccf-x8s6p" podUID="073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" containerName="console" containerID="cri-o://fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f" gracePeriod=15
Apr 24 19:13:09.511383 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.511361 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f788fccf-x8s6p_073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8/console/0.log"
Apr 24 19:13:09.511511 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.511423 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f788fccf-x8s6p"
Apr 24 19:13:09.689111 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689079 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-config\") pod \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") "
Apr 24 19:13:09.689111 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689117 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-trusted-ca-bundle\") pod \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") "
Apr 24 19:13:09.689325 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689158 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-service-ca\") pod \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") "
Apr 24 19:13:09.689325 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689178 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-oauth-config\") pod \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") "
Apr 24 19:13:09.689325 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689222 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-serving-cert\") pod \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") "
Apr 24 19:13:09.689325 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689258 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-oauth-serving-cert\") pod \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") "
Apr 24 19:13:09.689325 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689308 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbl49\" (UniqueName: \"kubernetes.io/projected/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-kube-api-access-zbl49\") pod \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\" (UID: \"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8\") "
Apr 24 19:13:09.689579 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689498 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-config" (OuterVolumeSpecName: "console-config") pod "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" (UID: "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:13:09.689674 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689530 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-service-ca" (OuterVolumeSpecName: "service-ca") pod "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" (UID: "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:13:09.689805 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689775 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" (UID: "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:13:09.689893 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.689868 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" (UID: "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:13:09.691374 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.691348 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" (UID: "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:13:09.691500 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.691371 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-kube-api-access-zbl49" (OuterVolumeSpecName: "kube-api-access-zbl49") pod "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" (UID: "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8"). InnerVolumeSpecName "kube-api-access-zbl49". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:13:09.691500 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.691435 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" (UID: "073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:13:09.789974 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.789935 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-service-ca\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:09.789974 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.789967 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-oauth-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:09.789974 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.789982 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-serving-cert\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:09.790213 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.789996 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-oauth-serving-cert\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:09.790213 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.790009 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbl49\" (UniqueName: \"kubernetes.io/projected/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-kube-api-access-zbl49\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:09.790213 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.790022 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-console-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:09.790213 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:09.790033 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8-trusted-ca-bundle\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:10.288989 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.288960 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f788fccf-x8s6p_073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8/console/0.log"
Apr 24 19:13:10.289426 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.289000 2573 generic.go:358] "Generic (PLEG): container finished" podID="073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" containerID="fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f" exitCode=2
Apr 24 19:13:10.289426 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.289036 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f788fccf-x8s6p" event={"ID":"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8","Type":"ContainerDied","Data":"fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f"}
Apr 24 19:13:10.289426 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.289061 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f788fccf-x8s6p" event={"ID":"073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8","Type":"ContainerDied","Data":"dac3b5e3b789650740ef5b1efa812d3616112bbda73c68d121aa1744bded907c"}
Apr 24 19:13:10.289426 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.289076 2573 scope.go:117] "RemoveContainer" containerID="fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f"
Apr 24 19:13:10.289426 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.289094 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f788fccf-x8s6p"
Apr 24 19:13:10.298300 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.298282 2573 scope.go:117] "RemoveContainer" containerID="fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f"
Apr 24 19:13:10.298561 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:10.298538 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f\": container with ID starting with fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f not found: ID does not exist" containerID="fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f"
Apr 24 19:13:10.298645 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.298571 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f"} err="failed to get container status \"fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f\": rpc error: code = NotFound desc = could not find container \"fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f\": container with ID starting with fdf027c582030f012fcaa19d59c8438daae3bae9f20cce69fb6112e172934c0f not found: ID does not exist"
Apr 24 19:13:10.310792 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.310767 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f788fccf-x8s6p"]
Apr 24 19:13:10.316535 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:10.316513 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f788fccf-x8s6p"]
Apr 24 19:13:11.119675 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:11.119633 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" path="/var/lib/kubelet/pods/073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8/volumes"
Apr 24 19:13:29.792990 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.792954 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"]
Apr 24 19:13:29.793392 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.793283 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" containerName="console"
Apr 24 19:13:29.793392 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.793294 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" containerName="console"
Apr 24 19:13:29.793392 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.793350 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="073d2a6e-191c-4ce7-9ee6-c3fbf0b1dcf8" containerName="console"
Apr 24 19:13:29.796387 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.796370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:29.798972 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.798949 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-xf2hd\""
Apr 24 19:13:29.799115 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.798949 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 19:13:29.799748 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.799735 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 19:13:29.807837 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.807808 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"]
Apr 24 19:13:29.847535 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.847485 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svgwn\" (UniqueName: \"kubernetes.io/projected/b4901f91-114f-44f1-85c2-a86960afc2f0-kube-api-access-svgwn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:29.847756 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.847553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:29.847756 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.847587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:29.948445 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.948396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:29.948644 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.948478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svgwn\" (UniqueName: \"kubernetes.io/projected/b4901f91-114f-44f1-85c2-a86960afc2f0-kube-api-access-svgwn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:29.948644 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.948520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:29.948897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.948876 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:29.948936 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.948880 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:29.956854 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:29.956833 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svgwn\" (UniqueName: \"kubernetes.io/projected/b4901f91-114f-44f1-85c2-a86960afc2f0-kube-api-access-svgwn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:30.110773 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:30.110687 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:30.233304 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:30.233186 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"]
Apr 24 19:13:30.235831 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:13:30.235801 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4901f91_114f_44f1_85c2_a86960afc2f0.slice/crio-a763a946e932d0ee078dd78dc3a06cd9c61bdc715bdc1005ba1c668bb1b6e3c2 WatchSource:0}: Error finding container a763a946e932d0ee078dd78dc3a06cd9c61bdc715bdc1005ba1c668bb1b6e3c2: Status 404 returned error can't find the container with id a763a946e932d0ee078dd78dc3a06cd9c61bdc715bdc1005ba1c668bb1b6e3c2
Apr 24 19:13:30.348199 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:30.348155 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl" event={"ID":"b4901f91-114f-44f1-85c2-a86960afc2f0","Type":"ContainerStarted","Data":"a763a946e932d0ee078dd78dc3a06cd9c61bdc715bdc1005ba1c668bb1b6e3c2"}
Apr 24 19:13:36.367662 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:36.367622 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerID="34502abbe111ec1414f105f90925985f3d4b96cd424276fad20f278d997c7903" exitCode=0
Apr 24 19:13:36.368065 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:36.367713 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl" event={"ID":"b4901f91-114f-44f1-85c2-a86960afc2f0","Type":"ContainerDied","Data":"34502abbe111ec1414f105f90925985f3d4b96cd424276fad20f278d997c7903"}
Apr 24 19:13:38.375249 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:38.375156 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerID="fc4eda9e09169a319d746c8f046766bb2dd519397b4cd1f6b637e0cd0085c4db" exitCode=0
Apr 24 19:13:38.375717 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:38.375268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl" event={"ID":"b4901f91-114f-44f1-85c2-a86960afc2f0","Type":"ContainerDied","Data":"fc4eda9e09169a319d746c8f046766bb2dd519397b4cd1f6b637e0cd0085c4db"}
Apr 24 19:13:45.399727 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:45.399693 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerID="5d630be6243acb011e320ea7a318f9c28a3ced14da785ad738b47a0bb500353d" exitCode=0
Apr 24 19:13:45.400131 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:45.399763 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl" event={"ID":"b4901f91-114f-44f1-85c2-a86960afc2f0","Type":"ContainerDied","Data":"5d630be6243acb011e320ea7a318f9c28a3ced14da785ad738b47a0bb500353d"}
Apr 24 19:13:46.523247 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.523218 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:46.600934 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.600897 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svgwn\" (UniqueName: \"kubernetes.io/projected/b4901f91-114f-44f1-85c2-a86960afc2f0-kube-api-access-svgwn\") pod \"b4901f91-114f-44f1-85c2-a86960afc2f0\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") "
Apr 24 19:13:46.601099 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.600951 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-util\") pod \"b4901f91-114f-44f1-85c2-a86960afc2f0\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") "
Apr 24 19:13:46.601099 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.600975 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-bundle\") pod \"b4901f91-114f-44f1-85c2-a86960afc2f0\" (UID: \"b4901f91-114f-44f1-85c2-a86960afc2f0\") "
Apr 24 19:13:46.601598 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.601578 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-bundle" (OuterVolumeSpecName: "bundle") pod "b4901f91-114f-44f1-85c2-a86960afc2f0" (UID: "b4901f91-114f-44f1-85c2-a86960afc2f0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:13:46.603230 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.603202 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4901f91-114f-44f1-85c2-a86960afc2f0-kube-api-access-svgwn" (OuterVolumeSpecName: "kube-api-access-svgwn") pod "b4901f91-114f-44f1-85c2-a86960afc2f0" (UID: "b4901f91-114f-44f1-85c2-a86960afc2f0"). InnerVolumeSpecName "kube-api-access-svgwn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:13:46.605828 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.605805 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-util" (OuterVolumeSpecName: "util") pod "b4901f91-114f-44f1-85c2-a86960afc2f0" (UID: "b4901f91-114f-44f1-85c2-a86960afc2f0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:13:46.701992 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.701963 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-svgwn\" (UniqueName: \"kubernetes.io/projected/b4901f91-114f-44f1-85c2-a86960afc2f0-kube-api-access-svgwn\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:46.701992 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.701990 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-util\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:46.701992 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:46.702001 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4901f91-114f-44f1-85c2-a86960afc2f0-bundle\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\""
Apr 24 19:13:47.407234 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:47.407159 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl"
Apr 24 19:13:47.407372 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:47.407157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4d2rl" event={"ID":"b4901f91-114f-44f1-85c2-a86960afc2f0","Type":"ContainerDied","Data":"a763a946e932d0ee078dd78dc3a06cd9c61bdc715bdc1005ba1c668bb1b6e3c2"}
Apr 24 19:13:47.407372 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:47.407260 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a763a946e932d0ee078dd78dc3a06cd9c61bdc715bdc1005ba1c668bb1b6e3c2"
Apr 24 19:13:51.798146 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.798113 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h"]
Apr 24 19:13:51.798516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.798483 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerName="extract"
Apr 24 19:13:51.798516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.798495 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerName="extract"
Apr 24 19:13:51.798516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.798507 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerName="util"
Apr 24 19:13:51.798516 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.798512 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerName="util"
Apr 24 19:13:51.798700 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.798521 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerName="pull"
Apr 24 19:13:51.798700 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.798527 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerName="pull"
Apr 24 19:13:51.798700 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.798624 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4901f91-114f-44f1-85c2-a86960afc2f0" containerName="extract"
Apr 24 19:13:51.848680 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.848644 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h"]
Apr 24 19:13:51.848836 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.848772 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h"
Apr 24 19:13:51.851894 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.851864 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 19:13:51.852031 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.851901 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-hm89z\""
Apr 24 19:13:51.852031 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.852023 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 19:13:51.852172 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.852156 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 19:13:51.947851 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.947813 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfxrp\" (UniqueName:
\"kubernetes.io/projected/8b395977-f678-41cf-813e-45f64183331c-kube-api-access-kfxrp\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h\" (UID: \"8b395977-f678-41cf-813e-45f64183331c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" Apr 24 19:13:51.948021 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:51.947862 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8b395977-f678-41cf-813e-45f64183331c-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h\" (UID: \"8b395977-f678-41cf-813e-45f64183331c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" Apr 24 19:13:52.049354 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:52.049272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfxrp\" (UniqueName: \"kubernetes.io/projected/8b395977-f678-41cf-813e-45f64183331c-kube-api-access-kfxrp\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h\" (UID: \"8b395977-f678-41cf-813e-45f64183331c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" Apr 24 19:13:52.049354 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:52.049310 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8b395977-f678-41cf-813e-45f64183331c-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h\" (UID: \"8b395977-f678-41cf-813e-45f64183331c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" Apr 24 19:13:52.051753 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:52.051730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8b395977-f678-41cf-813e-45f64183331c-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h\" (UID: 
\"8b395977-f678-41cf-813e-45f64183331c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" Apr 24 19:13:52.057635 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:52.057580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfxrp\" (UniqueName: \"kubernetes.io/projected/8b395977-f678-41cf-813e-45f64183331c-kube-api-access-kfxrp\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h\" (UID: \"8b395977-f678-41cf-813e-45f64183331c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" Apr 24 19:13:52.158817 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:52.158782 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" Apr 24 19:13:52.288965 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:52.288941 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h"] Apr 24 19:13:52.290760 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:13:52.290730 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b395977_f678_41cf_813e_45f64183331c.slice/crio-bacca2ea61afa064cc4ad65b244cc543ce7002251c166ac4506bcf14650419ab WatchSource:0}: Error finding container bacca2ea61afa064cc4ad65b244cc543ce7002251c166ac4506bcf14650419ab: Status 404 returned error can't find the container with id bacca2ea61afa064cc4ad65b244cc543ce7002251c166ac4506bcf14650419ab Apr 24 19:13:52.423697 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:52.423595 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" event={"ID":"8b395977-f678-41cf-813e-45f64183331c","Type":"ContainerStarted","Data":"bacca2ea61afa064cc4ad65b244cc543ce7002251c166ac4506bcf14650419ab"} Apr 24 19:13:57.067621 ip-10-0-131-214 kubenswrapper[2573]: 
I0424 19:13:57.067583 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jwlz5"] Apr 24 19:13:57.085763 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.085738 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.089592 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.089568 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 19:13:57.090053 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.090028 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-n2965\"" Apr 24 19:13:57.090888 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.090864 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 19:13:57.092721 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.092697 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jwlz5"] Apr 24 19:13:57.201973 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.201933 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.202144 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.202029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm6d9\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-kube-api-access-tm6d9\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " 
pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.202144 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.202113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1216dd76-6db7-49ac-879e-624f81daa111-cabundle0\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.303100 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.303051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1216dd76-6db7-49ac-879e-624f81daa111-cabundle0\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.303307 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.303119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.303307 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.303152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tm6d9\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-kube-api-access-tm6d9\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.303307 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.303267 2573 secret.go:281] references non-existent secret key: ca.crt Apr 24 19:13:57.303307 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.303287 2573 projected.go:277] Couldn't get secret payload 
openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 19:13:57.303307 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.303296 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jwlz5: references non-existent secret key: ca.crt Apr 24 19:13:57.303559 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.303350 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates podName:1216dd76-6db7-49ac-879e-624f81daa111 nodeName:}" failed. No retries permitted until 2026-04-24 19:13:57.803334182 +0000 UTC m=+411.273493688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates") pod "keda-operator-ffbb595cb-jwlz5" (UID: "1216dd76-6db7-49ac-879e-624f81daa111") : references non-existent secret key: ca.crt Apr 24 19:13:57.303805 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.303784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1216dd76-6db7-49ac-879e-624f81daa111-cabundle0\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.314928 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.314903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm6d9\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-kube-api-access-tm6d9\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.426182 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.426153 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm"] Apr 24 19:13:57.447747 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.447706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" event={"ID":"8b395977-f678-41cf-813e-45f64183331c","Type":"ContainerStarted","Data":"228f57da9d40b7f195c8d18ecf9e0a189b246ca53e476d0e13052ec224bf65f0"} Apr 24 19:13:57.447747 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.447750 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm"] Apr 24 19:13:57.447988 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.447855 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:57.447988 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.447927 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" Apr 24 19:13:57.450377 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.450354 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 19:13:57.482210 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.482153 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" podStartSLOduration=2.338690815 podStartE2EDuration="6.48213431s" podCreationTimestamp="2026-04-24 19:13:51 +0000 UTC" firstStartedPulling="2026-04-24 19:13:52.292760009 +0000 UTC m=+405.762919521" lastFinishedPulling="2026-04-24 19:13:56.436203509 +0000 UTC m=+409.906363016" observedRunningTime="2026-04-24 19:13:57.481125682 +0000 UTC m=+410.951285220" watchObservedRunningTime="2026-04-24 19:13:57.48213431 +0000 UTC m=+410.952293840" Apr 24 19:13:57.605669 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:13:57.605604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkdp6\" (UniqueName: \"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-kube-api-access-vkdp6\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:57.605873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.605687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a9856ebe-28f5-4162-b678-26cc3cee94f0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:57.605873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.605721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:57.706290 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.706203 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a9856ebe-28f5-4162-b678-26cc3cee94f0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:57.706290 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.706248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:57.706471 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.706363 2573 secret.go:281] references non-existent secret key: tls.crt Apr 24 19:13:57.706471 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.706378 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 19:13:57.706471 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.706397 2573 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 24 19:13:57.706471 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.706409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkdp6\" (UniqueName: \"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-kube-api-access-vkdp6\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:57.706471 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.706421 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 19:13:57.706651 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.706480 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates podName:a9856ebe-28f5-4162-b678-26cc3cee94f0 nodeName:}" failed. No retries permitted until 2026-04-24 19:13:58.206462011 +0000 UTC m=+411.676621522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates") pod "keda-metrics-apiserver-7c9f485588-xdxwm" (UID: "a9856ebe-28f5-4162-b678-26cc3cee94f0") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 19:13:57.706651 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.706588 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a9856ebe-28f5-4162-b678-26cc3cee94f0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:57.720786 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.720754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkdp6\" (UniqueName: \"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-kube-api-access-vkdp6\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:57.807897 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:57.807866 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:57.808061 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.807985 2573 secret.go:281] references non-existent secret key: ca.crt Apr 24 19:13:57.808061 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.807997 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 19:13:57.808061 ip-10-0-131-214 
kubenswrapper[2573]: E0424 19:13:57.808005 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jwlz5: references non-existent secret key: ca.crt Apr 24 19:13:57.808061 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:57.808055 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates podName:1216dd76-6db7-49ac-879e-624f81daa111 nodeName:}" failed. No retries permitted until 2026-04-24 19:13:58.808042794 +0000 UTC m=+412.278202300 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates") pod "keda-operator-ffbb595cb-jwlz5" (UID: "1216dd76-6db7-49ac-879e-624f81daa111") : references non-existent secret key: ca.crt Apr 24 19:13:58.211327 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:58.211292 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:58.211818 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:58.211468 2573 secret.go:281] references non-existent secret key: tls.crt Apr 24 19:13:58.211818 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:58.211488 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 19:13:58.211818 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:58.211511 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm: references non-existent secret key: tls.crt Apr 24 19:13:58.211818 ip-10-0-131-214 kubenswrapper[2573]: 
E0424 19:13:58.211582 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates podName:a9856ebe-28f5-4162-b678-26cc3cee94f0 nodeName:}" failed. No retries permitted until 2026-04-24 19:13:59.211561306 +0000 UTC m=+412.681720829 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates") pod "keda-metrics-apiserver-7c9f485588-xdxwm" (UID: "a9856ebe-28f5-4162-b678-26cc3cee94f0") : references non-existent secret key: tls.crt Apr 24 19:13:58.816794 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:58.816757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:13:58.817010 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:58.816929 2573 secret.go:281] references non-existent secret key: ca.crt Apr 24 19:13:58.817010 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:58.816952 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 19:13:58.817010 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:58.816962 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jwlz5: references non-existent secret key: ca.crt Apr 24 19:13:58.817174 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:58.817023 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates podName:1216dd76-6db7-49ac-879e-624f81daa111 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:14:00.817003973 +0000 UTC m=+414.287163478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates") pod "keda-operator-ffbb595cb-jwlz5" (UID: "1216dd76-6db7-49ac-879e-624f81daa111") : references non-existent secret key: ca.crt Apr 24 19:13:59.220540 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:13:59.220513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:13:59.220918 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:59.220686 2573 secret.go:281] references non-existent secret key: tls.crt Apr 24 19:13:59.220918 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:59.220704 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 19:13:59.220918 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:59.220723 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm: references non-existent secret key: tls.crt Apr 24 19:13:59.220918 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:13:59.220775 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates podName:a9856ebe-28f5-4162-b678-26cc3cee94f0 nodeName:}" failed. No retries permitted until 2026-04-24 19:14:01.220760937 +0000 UTC m=+414.690920443 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates") pod "keda-metrics-apiserver-7c9f485588-xdxwm" (UID: "a9856ebe-28f5-4162-b678-26cc3cee94f0") : references non-existent secret key: tls.crt Apr 24 19:14:00.833475 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:00.833425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:14:00.834022 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:14:00.833602 2573 secret.go:281] references non-existent secret key: ca.crt Apr 24 19:14:00.834022 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:14:00.833654 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 19:14:00.834022 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:14:00.833668 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jwlz5: references non-existent secret key: ca.crt Apr 24 19:14:00.834022 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:14:00.833734 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates podName:1216dd76-6db7-49ac-879e-624f81daa111 nodeName:}" failed. No retries permitted until 2026-04-24 19:14:04.833716136 +0000 UTC m=+418.303875643 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates") pod "keda-operator-ffbb595cb-jwlz5" (UID: "1216dd76-6db7-49ac-879e-624f81daa111") : references non-existent secret key: ca.crt Apr 24 19:14:01.235991 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:01.235952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:14:01.236170 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:14:01.236111 2573 secret.go:281] references non-existent secret key: tls.crt Apr 24 19:14:01.236170 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:14:01.236133 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 19:14:01.236170 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:14:01.236157 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm: references non-existent secret key: tls.crt Apr 24 19:14:01.236359 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:14:01.236223 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates podName:a9856ebe-28f5-4162-b678-26cc3cee94f0 nodeName:}" failed. No retries permitted until 2026-04-24 19:14:05.236205695 +0000 UTC m=+418.706365217 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates") pod "keda-metrics-apiserver-7c9f485588-xdxwm" (UID: "a9856ebe-28f5-4162-b678-26cc3cee94f0") : references non-existent secret key: tls.crt Apr 24 19:14:04.868419 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:04.868378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:14:04.870920 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:04.870895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1216dd76-6db7-49ac-879e-624f81daa111-certificates\") pod \"keda-operator-ffbb595cb-jwlz5\" (UID: \"1216dd76-6db7-49ac-879e-624f81daa111\") " pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:14:04.896493 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:04.896442 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:14:05.022204 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:05.022178 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jwlz5"] Apr 24 19:14:05.024895 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:14:05.024863 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1216dd76_6db7_49ac_879e_624f81daa111.slice/crio-8dd5772a9e461e337fe5224afee072eeb3d82ddadcfb722ad78f66fcc6cb19f8 WatchSource:0}: Error finding container 8dd5772a9e461e337fe5224afee072eeb3d82ddadcfb722ad78f66fcc6cb19f8: Status 404 returned error can't find the container with id 8dd5772a9e461e337fe5224afee072eeb3d82ddadcfb722ad78f66fcc6cb19f8 Apr 24 19:14:05.272085 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:05.272047 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:14:05.274570 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:05.274545 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a9856ebe-28f5-4162-b678-26cc3cee94f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xdxwm\" (UID: \"a9856ebe-28f5-4162-b678-26cc3cee94f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:14:05.473109 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:05.473068 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" 
event={"ID":"1216dd76-6db7-49ac-879e-624f81daa111","Type":"ContainerStarted","Data":"8dd5772a9e461e337fe5224afee072eeb3d82ddadcfb722ad78f66fcc6cb19f8"} Apr 24 19:14:05.558645 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:05.558536 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:14:05.679595 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:05.679568 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm"] Apr 24 19:14:05.682390 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:14:05.682362 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9856ebe_28f5_4162_b678_26cc3cee94f0.slice/crio-2874a91b4b2451ede4401175d50b161d8876541a0d1fa804a27ddc39395d0060 WatchSource:0}: Error finding container 2874a91b4b2451ede4401175d50b161d8876541a0d1fa804a27ddc39395d0060: Status 404 returned error can't find the container with id 2874a91b4b2451ede4401175d50b161d8876541a0d1fa804a27ddc39395d0060 Apr 24 19:14:06.478621 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:06.478577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" event={"ID":"a9856ebe-28f5-4162-b678-26cc3cee94f0","Type":"ContainerStarted","Data":"2874a91b4b2451ede4401175d50b161d8876541a0d1fa804a27ddc39395d0060"} Apr 24 19:14:09.491994 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:09.491954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" event={"ID":"a9856ebe-28f5-4162-b678-26cc3cee94f0","Type":"ContainerStarted","Data":"9f91ddc38ce2d5a303e584e08465096a0a84f3ce50766ab6a0cf9c252e58f233"} Apr 24 19:14:09.492449 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:09.492090 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:14:09.493369 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:09.493347 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" event={"ID":"1216dd76-6db7-49ac-879e-624f81daa111","Type":"ContainerStarted","Data":"097f91622bc36461ed55442d6f928a9dc6036fffbe3b444dacd76cc341809f5a"} Apr 24 19:14:09.493479 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:09.493461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:14:09.510049 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:09.510000 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" podStartSLOduration=9.251799339 podStartE2EDuration="12.509988855s" podCreationTimestamp="2026-04-24 19:13:57 +0000 UTC" firstStartedPulling="2026-04-24 19:14:05.684383483 +0000 UTC m=+419.154542997" lastFinishedPulling="2026-04-24 19:14:08.942573004 +0000 UTC m=+422.412732513" observedRunningTime="2026-04-24 19:14:09.508111017 +0000 UTC m=+422.978270542" watchObservedRunningTime="2026-04-24 19:14:09.509988855 +0000 UTC m=+422.980148383" Apr 24 19:14:09.524664 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:09.524625 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" podStartSLOduration=8.608554639 podStartE2EDuration="12.52459777s" podCreationTimestamp="2026-04-24 19:13:57 +0000 UTC" firstStartedPulling="2026-04-24 19:14:05.026536091 +0000 UTC m=+418.496695598" lastFinishedPulling="2026-04-24 19:14:08.942579223 +0000 UTC m=+422.412738729" observedRunningTime="2026-04-24 19:14:09.522965512 +0000 UTC m=+422.993125041" watchObservedRunningTime="2026-04-24 19:14:09.52459777 +0000 UTC m=+422.994757277" Apr 24 19:14:18.450873 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:18.450839 
2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-mzd5h" Apr 24 19:14:20.502024 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:20.501996 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xdxwm" Apr 24 19:14:30.500168 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:14:30.500076 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-jwlz5" Apr 24 19:15:03.875843 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.875807 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-bvvp6"] Apr 24 19:15:03.878505 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.878487 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:03.881933 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.881909 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 19:15:03.882082 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.882069 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-d2r95\"" Apr 24 19:15:03.882859 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.882841 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 19:15:03.882949 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.882873 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 19:15:03.887154 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.887128 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dh96v"] Apr 24 
19:15:03.889496 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.889477 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:03.892945 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.892924 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-jt8nj\"" Apr 24 19:15:03.893112 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.893065 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-bvvp6"] Apr 24 19:15:03.893330 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.893306 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 19:15:03.904049 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.904028 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dh96v"] Apr 24 19:15:03.994057 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.994016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80404089-e731-4d12-82e8-02e3a6a9f8e0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dh96v\" (UID: \"80404089-e731-4d12-82e8-02e3a6a9f8e0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:03.994247 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.994068 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4tmz\" (UniqueName: \"kubernetes.io/projected/ef970dc7-b5b1-44fb-888a-812c600a9edc-kube-api-access-p4tmz\") pod \"kserve-controller-manager-8cdbbc8b5-bvvp6\" (UID: \"ef970dc7-b5b1-44fb-888a-812c600a9edc\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:03.994247 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.994128 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef970dc7-b5b1-44fb-888a-812c600a9edc-cert\") pod \"kserve-controller-manager-8cdbbc8b5-bvvp6\" (UID: \"ef970dc7-b5b1-44fb-888a-812c600a9edc\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:03.994247 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:03.994154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwhlt\" (UniqueName: \"kubernetes.io/projected/80404089-e731-4d12-82e8-02e3a6a9f8e0-kube-api-access-fwhlt\") pod \"llmisvc-controller-manager-68cc5db7c4-dh96v\" (UID: \"80404089-e731-4d12-82e8-02e3a6a9f8e0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:04.095039 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.095002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4tmz\" (UniqueName: \"kubernetes.io/projected/ef970dc7-b5b1-44fb-888a-812c600a9edc-kube-api-access-p4tmz\") pod \"kserve-controller-manager-8cdbbc8b5-bvvp6\" (UID: \"ef970dc7-b5b1-44fb-888a-812c600a9edc\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:04.095240 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.095049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef970dc7-b5b1-44fb-888a-812c600a9edc-cert\") pod \"kserve-controller-manager-8cdbbc8b5-bvvp6\" (UID: \"ef970dc7-b5b1-44fb-888a-812c600a9edc\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:04.095240 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.095078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwhlt\" (UniqueName: \"kubernetes.io/projected/80404089-e731-4d12-82e8-02e3a6a9f8e0-kube-api-access-fwhlt\") pod 
\"llmisvc-controller-manager-68cc5db7c4-dh96v\" (UID: \"80404089-e731-4d12-82e8-02e3a6a9f8e0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:04.095240 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.095151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80404089-e731-4d12-82e8-02e3a6a9f8e0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dh96v\" (UID: \"80404089-e731-4d12-82e8-02e3a6a9f8e0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:04.097584 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.097557 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef970dc7-b5b1-44fb-888a-812c600a9edc-cert\") pod \"kserve-controller-manager-8cdbbc8b5-bvvp6\" (UID: \"ef970dc7-b5b1-44fb-888a-812c600a9edc\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:04.097736 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.097590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80404089-e731-4d12-82e8-02e3a6a9f8e0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dh96v\" (UID: \"80404089-e731-4d12-82e8-02e3a6a9f8e0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:04.104636 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.104588 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4tmz\" (UniqueName: \"kubernetes.io/projected/ef970dc7-b5b1-44fb-888a-812c600a9edc-kube-api-access-p4tmz\") pod \"kserve-controller-manager-8cdbbc8b5-bvvp6\" (UID: \"ef970dc7-b5b1-44fb-888a-812c600a9edc\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:04.106545 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.106521 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwhlt\" 
(UniqueName: \"kubernetes.io/projected/80404089-e731-4d12-82e8-02e3a6a9f8e0-kube-api-access-fwhlt\") pod \"llmisvc-controller-manager-68cc5db7c4-dh96v\" (UID: \"80404089-e731-4d12-82e8-02e3a6a9f8e0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:04.188739 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.188701 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:04.199566 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.199542 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:04.325621 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.325579 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-bvvp6"] Apr 24 19:15:04.327239 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:15:04.327203 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef970dc7_b5b1_44fb_888a_812c600a9edc.slice/crio-8695597d61698539121f99c90f2916c8267b0a9ffc07305ff4813c5eabc3a045 WatchSource:0}: Error finding container 8695597d61698539121f99c90f2916c8267b0a9ffc07305ff4813c5eabc3a045: Status 404 returned error can't find the container with id 8695597d61698539121f99c90f2916c8267b0a9ffc07305ff4813c5eabc3a045 Apr 24 19:15:04.350336 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.350311 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dh96v"] Apr 24 19:15:04.353300 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:15:04.353276 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod80404089_e731_4d12_82e8_02e3a6a9f8e0.slice/crio-d724cd2440eb1229454adad2d00c7029a764dd91839c6e091ac5ab1bb3d7cf0b WatchSource:0}: Error finding container 
d724cd2440eb1229454adad2d00c7029a764dd91839c6e091ac5ab1bb3d7cf0b: Status 404 returned error can't find the container with id d724cd2440eb1229454adad2d00c7029a764dd91839c6e091ac5ab1bb3d7cf0b Apr 24 19:15:04.688899 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.688869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" event={"ID":"ef970dc7-b5b1-44fb-888a-812c600a9edc","Type":"ContainerStarted","Data":"8695597d61698539121f99c90f2916c8267b0a9ffc07305ff4813c5eabc3a045"} Apr 24 19:15:04.689800 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:04.689777 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" event={"ID":"80404089-e731-4d12-82e8-02e3a6a9f8e0","Type":"ContainerStarted","Data":"d724cd2440eb1229454adad2d00c7029a764dd91839c6e091ac5ab1bb3d7cf0b"} Apr 24 19:15:07.702325 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:07.702283 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" event={"ID":"80404089-e731-4d12-82e8-02e3a6a9f8e0","Type":"ContainerStarted","Data":"7d139236857a1ad533534edcec4419b927d9b28848e77c476bd6c29da4fbe869"} Apr 24 19:15:07.702970 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:07.702356 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:07.703693 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:07.703672 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" event={"ID":"ef970dc7-b5b1-44fb-888a-812c600a9edc","Type":"ContainerStarted","Data":"09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d"} Apr 24 19:15:07.703809 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:07.703770 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:07.719271 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:07.719224 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" podStartSLOduration=1.486390941 podStartE2EDuration="4.719212163s" podCreationTimestamp="2026-04-24 19:15:03 +0000 UTC" firstStartedPulling="2026-04-24 19:15:04.354626952 +0000 UTC m=+477.824786461" lastFinishedPulling="2026-04-24 19:15:07.587448163 +0000 UTC m=+481.057607683" observedRunningTime="2026-04-24 19:15:07.717967911 +0000 UTC m=+481.188127461" watchObservedRunningTime="2026-04-24 19:15:07.719212163 +0000 UTC m=+481.189371690" Apr 24 19:15:07.734855 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:07.734813 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" podStartSLOduration=1.47630944 podStartE2EDuration="4.734800651s" podCreationTimestamp="2026-04-24 19:15:03 +0000 UTC" firstStartedPulling="2026-04-24 19:15:04.328795736 +0000 UTC m=+477.798955245" lastFinishedPulling="2026-04-24 19:15:07.587286946 +0000 UTC m=+481.057446456" observedRunningTime="2026-04-24 19:15:07.733464585 +0000 UTC m=+481.203624113" watchObservedRunningTime="2026-04-24 19:15:07.734800651 +0000 UTC m=+481.204960213" Apr 24 19:15:38.709014 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:38.708979 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dh96v" Apr 24 19:15:38.711928 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:38.711905 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:40.073700 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.073663 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-bvvp6"] Apr 24 19:15:40.074128 
ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.073902 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" podUID="ef970dc7-b5b1-44fb-888a-812c600a9edc" containerName="manager" containerID="cri-o://09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d" gracePeriod=10 Apr 24 19:15:40.098227 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.098198 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-mzs94"] Apr 24 19:15:40.100671 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.100656 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:15:40.112436 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.112409 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-mzs94"] Apr 24 19:15:40.221536 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.221501 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175a576a-5ae9-4449-94e2-ce494e7056bf-cert\") pod \"kserve-controller-manager-8cdbbc8b5-mzs94\" (UID: \"175a576a-5ae9-4449-94e2-ce494e7056bf\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:15:40.221725 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.221545 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n7nx\" (UniqueName: \"kubernetes.io/projected/175a576a-5ae9-4449-94e2-ce494e7056bf-kube-api-access-6n7nx\") pod \"kserve-controller-manager-8cdbbc8b5-mzs94\" (UID: \"175a576a-5ae9-4449-94e2-ce494e7056bf\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:15:40.310771 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.310748 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:40.322443 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.322418 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef970dc7-b5b1-44fb-888a-812c600a9edc-cert\") pod \"ef970dc7-b5b1-44fb-888a-812c600a9edc\" (UID: \"ef970dc7-b5b1-44fb-888a-812c600a9edc\") " Apr 24 19:15:40.322592 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.322479 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4tmz\" (UniqueName: \"kubernetes.io/projected/ef970dc7-b5b1-44fb-888a-812c600a9edc-kube-api-access-p4tmz\") pod \"ef970dc7-b5b1-44fb-888a-812c600a9edc\" (UID: \"ef970dc7-b5b1-44fb-888a-812c600a9edc\") " Apr 24 19:15:40.322592 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.322543 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175a576a-5ae9-4449-94e2-ce494e7056bf-cert\") pod \"kserve-controller-manager-8cdbbc8b5-mzs94\" (UID: \"175a576a-5ae9-4449-94e2-ce494e7056bf\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:15:40.322592 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.322576 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n7nx\" (UniqueName: \"kubernetes.io/projected/175a576a-5ae9-4449-94e2-ce494e7056bf-kube-api-access-6n7nx\") pod \"kserve-controller-manager-8cdbbc8b5-mzs94\" (UID: \"175a576a-5ae9-4449-94e2-ce494e7056bf\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:15:40.324670 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.324589 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef970dc7-b5b1-44fb-888a-812c600a9edc-cert" (OuterVolumeSpecName: "cert") pod "ef970dc7-b5b1-44fb-888a-812c600a9edc" (UID: 
"ef970dc7-b5b1-44fb-888a-812c600a9edc"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:15:40.324756 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.324714 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef970dc7-b5b1-44fb-888a-812c600a9edc-kube-api-access-p4tmz" (OuterVolumeSpecName: "kube-api-access-p4tmz") pod "ef970dc7-b5b1-44fb-888a-812c600a9edc" (UID: "ef970dc7-b5b1-44fb-888a-812c600a9edc"). InnerVolumeSpecName "kube-api-access-p4tmz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:15:40.324819 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.324803 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/175a576a-5ae9-4449-94e2-ce494e7056bf-cert\") pod \"kserve-controller-manager-8cdbbc8b5-mzs94\" (UID: \"175a576a-5ae9-4449-94e2-ce494e7056bf\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:15:40.331970 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.331880 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n7nx\" (UniqueName: \"kubernetes.io/projected/175a576a-5ae9-4449-94e2-ce494e7056bf-kube-api-access-6n7nx\") pod \"kserve-controller-manager-8cdbbc8b5-mzs94\" (UID: \"175a576a-5ae9-4449-94e2-ce494e7056bf\") " pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:15:40.423561 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.423522 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4tmz\" (UniqueName: \"kubernetes.io/projected/ef970dc7-b5b1-44fb-888a-812c600a9edc-kube-api-access-p4tmz\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:15:40.423561 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.423555 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef970dc7-b5b1-44fb-888a-812c600a9edc-cert\") 
on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:15:40.470110 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.470083 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:15:40.596456 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.596433 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-mzs94"] Apr 24 19:15:40.598657 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:15:40.598628 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175a576a_5ae9_4449_94e2_ce494e7056bf.slice/crio-6c96e02a616ca2d83a020b0db6d7415fb57740b450d129000f11ddbbb9c953d3 WatchSource:0}: Error finding container 6c96e02a616ca2d83a020b0db6d7415fb57740b450d129000f11ddbbb9c953d3: Status 404 returned error can't find the container with id 6c96e02a616ca2d83a020b0db6d7415fb57740b450d129000f11ddbbb9c953d3 Apr 24 19:15:40.812149 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.812110 2573 generic.go:358] "Generic (PLEG): container finished" podID="ef970dc7-b5b1-44fb-888a-812c600a9edc" containerID="09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d" exitCode=0 Apr 24 19:15:40.812355 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.812177 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" Apr 24 19:15:40.812355 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.812197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" event={"ID":"ef970dc7-b5b1-44fb-888a-812c600a9edc","Type":"ContainerDied","Data":"09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d"} Apr 24 19:15:40.812355 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.812236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-bvvp6" event={"ID":"ef970dc7-b5b1-44fb-888a-812c600a9edc","Type":"ContainerDied","Data":"8695597d61698539121f99c90f2916c8267b0a9ffc07305ff4813c5eabc3a045"} Apr 24 19:15:40.812355 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.812250 2573 scope.go:117] "RemoveContainer" containerID="09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d" Apr 24 19:15:40.813391 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.813365 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" event={"ID":"175a576a-5ae9-4449-94e2-ce494e7056bf","Type":"ContainerStarted","Data":"6c96e02a616ca2d83a020b0db6d7415fb57740b450d129000f11ddbbb9c953d3"} Apr 24 19:15:40.823938 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.823898 2573 scope.go:117] "RemoveContainer" containerID="09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d" Apr 24 19:15:40.824452 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:15:40.824427 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d\": container with ID starting with 09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d not found: ID does not exist" containerID="09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d" Apr 24 
19:15:40.824571 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.824462 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d"} err="failed to get container status \"09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d\": rpc error: code = NotFound desc = could not find container \"09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d\": container with ID starting with 09280ba243c4f29cb5414ccc919ba6dedba77b0b3b594507194c8f4a7a51332d not found: ID does not exist" Apr 24 19:15:40.837389 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.837359 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-bvvp6"] Apr 24 19:15:40.842329 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:40.842298 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-8cdbbc8b5-bvvp6"] Apr 24 19:15:41.120872 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:41.120839 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef970dc7-b5b1-44fb-888a-812c600a9edc" path="/var/lib/kubelet/pods/ef970dc7-b5b1-44fb-888a-812c600a9edc/volumes" Apr 24 19:15:41.819176 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:41.819138 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" event={"ID":"175a576a-5ae9-4449-94e2-ce494e7056bf","Type":"ContainerStarted","Data":"c5bd485defb2d59d10f5a232ac7126ccb94923c710a95ae048c3e84e063af85b"} Apr 24 19:15:41.819354 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:41.819268 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:15:41.841964 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:15:41.841915 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" podStartSLOduration=1.4951251380000001 podStartE2EDuration="1.841897934s" podCreationTimestamp="2026-04-24 19:15:40 +0000 UTC" firstStartedPulling="2026-04-24 19:15:40.599977552 +0000 UTC m=+514.070137057" lastFinishedPulling="2026-04-24 19:15:40.946750342 +0000 UTC m=+514.416909853" observedRunningTime="2026-04-24 19:15:41.840194039 +0000 UTC m=+515.310353560" watchObservedRunningTime="2026-04-24 19:15:41.841897934 +0000 UTC m=+515.312057461" Apr 24 19:16:12.828792 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:12.828761 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-8cdbbc8b5-mzs94" Apr 24 19:16:39.716713 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.716681 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85d7f4d7c9-8mw2j"] Apr 24 19:16:39.717116 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.717056 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef970dc7-b5b1-44fb-888a-812c600a9edc" containerName="manager" Apr 24 19:16:39.717116 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.717068 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef970dc7-b5b1-44fb-888a-812c600a9edc" containerName="manager" Apr 24 19:16:39.717193 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.717132 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef970dc7-b5b1-44fb-888a-812c600a9edc" containerName="manager" Apr 24 19:16:39.719325 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.719305 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.733412 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.733380 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82c7a3b7-4f81-44c2-b03f-394040e706c8-console-serving-cert\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.733571 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.733450 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d7f4d7c9-8mw2j"] Apr 24 19:16:39.733571 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.733455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-oauth-serving-cert\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.733571 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.733484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2phr8\" (UniqueName: \"kubernetes.io/projected/82c7a3b7-4f81-44c2-b03f-394040e706c8-kube-api-access-2phr8\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.733719 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.733569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-trusted-ca-bundle\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " 
pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.733719 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.733633 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-console-config\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.733796 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.733728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-service-ca\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.733796 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.733762 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82c7a3b7-4f81-44c2-b03f-394040e706c8-console-oauth-config\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.834326 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.834293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2phr8\" (UniqueName: \"kubernetes.io/projected/82c7a3b7-4f81-44c2-b03f-394040e706c8-kube-api-access-2phr8\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.834505 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.834337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-trusted-ca-bundle\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.834505 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.834358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-console-config\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.834505 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.834408 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-service-ca\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.834505 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.834449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82c7a3b7-4f81-44c2-b03f-394040e706c8-console-oauth-config\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.834753 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.834517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82c7a3b7-4f81-44c2-b03f-394040e706c8-console-serving-cert\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.834753 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.834601 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-oauth-serving-cert\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.835265 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.835237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-console-config\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.835395 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.835292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-service-ca\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.835395 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.835326 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-oauth-serving-cert\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.835517 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.835394 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82c7a3b7-4f81-44c2-b03f-394040e706c8-trusted-ca-bundle\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.837079 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.837057 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82c7a3b7-4f81-44c2-b03f-394040e706c8-console-oauth-config\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.837172 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.837124 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82c7a3b7-4f81-44c2-b03f-394040e706c8-console-serving-cert\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:39.842677 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:39.842656 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2phr8\" (UniqueName: \"kubernetes.io/projected/82c7a3b7-4f81-44c2-b03f-394040e706c8-kube-api-access-2phr8\") pod \"console-85d7f4d7c9-8mw2j\" (UID: \"82c7a3b7-4f81-44c2-b03f-394040e706c8\") " pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:40.033266 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:40.033171 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:40.161530 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:40.161502 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d7f4d7c9-8mw2j"] Apr 24 19:16:40.164254 ip-10-0-131-214 kubenswrapper[2573]: W0424 19:16:40.164214 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82c7a3b7_4f81_44c2_b03f_394040e706c8.slice/crio-48c616907e133065e47ac77b0b07b085ad5602e7ecc81a52dde0081034caaffe WatchSource:0}: Error finding container 48c616907e133065e47ac77b0b07b085ad5602e7ecc81a52dde0081034caaffe: Status 404 returned error can't find the container with id 48c616907e133065e47ac77b0b07b085ad5602e7ecc81a52dde0081034caaffe Apr 24 19:16:41.018361 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:41.018321 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d7f4d7c9-8mw2j" event={"ID":"82c7a3b7-4f81-44c2-b03f-394040e706c8","Type":"ContainerStarted","Data":"763c6142d53d862dc82e1c4350efb49ad3628ffda3b60ef0a4502496158770cc"} Apr 24 19:16:41.018361 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:41.018361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d7f4d7c9-8mw2j" event={"ID":"82c7a3b7-4f81-44c2-b03f-394040e706c8","Type":"ContainerStarted","Data":"48c616907e133065e47ac77b0b07b085ad5602e7ecc81a52dde0081034caaffe"} Apr 24 19:16:41.039281 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:41.039224 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85d7f4d7c9-8mw2j" podStartSLOduration=2.039206015 podStartE2EDuration="2.039206015s" podCreationTimestamp="2026-04-24 19:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:16:41.036821174 +0000 UTC 
m=+574.506980714" watchObservedRunningTime="2026-04-24 19:16:41.039206015 +0000 UTC m=+574.509365547" Apr 24 19:16:50.033476 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:50.033429 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:50.034109 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:50.033521 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:50.038403 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:50.038383 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:50.053596 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:50.053575 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85d7f4d7c9-8mw2j" Apr 24 19:16:50.104790 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:16:50.104759 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-764fd8f9f6-bwnwc"] Apr 24 19:17:07.023247 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:07.023214 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:17:07.024378 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:07.024356 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:17:07.028912 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:07.028889 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:17:07.030072 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:07.030053 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:17:15.126276 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.126236 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-764fd8f9f6-bwnwc" podUID="186b59ca-f095-423a-a635-46c8087c0242" containerName="console" containerID="cri-o://e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b" gracePeriod=15 Apr 24 19:17:15.373942 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.373919 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764fd8f9f6-bwnwc_186b59ca-f095-423a-a635-46c8087c0242/console/0.log" Apr 24 19:17:15.374073 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.373982 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764fd8f9f6-bwnwc" Apr 24 19:17:15.460650 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.460597 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-serving-cert\") pod \"186b59ca-f095-423a-a635-46c8087c0242\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " Apr 24 19:17:15.460834 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.460683 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-trusted-ca-bundle\") pod \"186b59ca-f095-423a-a635-46c8087c0242\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " Apr 24 19:17:15.460834 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.460714 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-console-config\") pod \"186b59ca-f095-423a-a635-46c8087c0242\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " Apr 24 19:17:15.460834 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.460764 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-oauth-config\") pod \"186b59ca-f095-423a-a635-46c8087c0242\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " Apr 24 19:17:15.460834 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.460786 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhh6s\" (UniqueName: \"kubernetes.io/projected/186b59ca-f095-423a-a635-46c8087c0242-kube-api-access-vhh6s\") pod \"186b59ca-f095-423a-a635-46c8087c0242\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " Apr 24 19:17:15.460834 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.460807 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-oauth-serving-cert\") pod \"186b59ca-f095-423a-a635-46c8087c0242\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " Apr 24 19:17:15.461094 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.460990 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-service-ca\") pod \"186b59ca-f095-423a-a635-46c8087c0242\" (UID: \"186b59ca-f095-423a-a635-46c8087c0242\") " Apr 24 19:17:15.461644 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.461523 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"186b59ca-f095-423a-a635-46c8087c0242" (UID: "186b59ca-f095-423a-a635-46c8087c0242"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:17:15.461644 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.461583 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "186b59ca-f095-423a-a635-46c8087c0242" (UID: "186b59ca-f095-423a-a635-46c8087c0242"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:17:15.461644 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.461587 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-console-config" (OuterVolumeSpecName: "console-config") pod "186b59ca-f095-423a-a635-46c8087c0242" (UID: "186b59ca-f095-423a-a635-46c8087c0242"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:17:15.461999 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.461960 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-service-ca" (OuterVolumeSpecName: "service-ca") pod "186b59ca-f095-423a-a635-46c8087c0242" (UID: "186b59ca-f095-423a-a635-46c8087c0242"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:17:15.467563 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.467531 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "186b59ca-f095-423a-a635-46c8087c0242" (UID: "186b59ca-f095-423a-a635-46c8087c0242"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:17:15.467720 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.467675 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "186b59ca-f095-423a-a635-46c8087c0242" (UID: "186b59ca-f095-423a-a635-46c8087c0242"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:17:15.469031 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.469008 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186b59ca-f095-423a-a635-46c8087c0242-kube-api-access-vhh6s" (OuterVolumeSpecName: "kube-api-access-vhh6s") pod "186b59ca-f095-423a-a635-46c8087c0242" (UID: "186b59ca-f095-423a-a635-46c8087c0242"). InnerVolumeSpecName "kube-api-access-vhh6s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:17:15.562582 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.562548 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-oauth-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:17:15.562582 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.562578 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vhh6s\" (UniqueName: \"kubernetes.io/projected/186b59ca-f095-423a-a635-46c8087c0242-kube-api-access-vhh6s\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:17:15.562582 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.562589 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-oauth-serving-cert\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:17:15.562864 
ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.562599 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-service-ca\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:17:15.562864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.562634 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/186b59ca-f095-423a-a635-46c8087c0242-console-serving-cert\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:17:15.562864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.562649 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-trusted-ca-bundle\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:17:15.562864 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:15.562663 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/186b59ca-f095-423a-a635-46c8087c0242-console-config\") on node \"ip-10-0-131-214.ec2.internal\" DevicePath \"\"" Apr 24 19:17:16.137960 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.137932 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764fd8f9f6-bwnwc_186b59ca-f095-423a-a635-46c8087c0242/console/0.log" Apr 24 19:17:16.138423 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.137972 2573 generic.go:358] "Generic (PLEG): container finished" podID="186b59ca-f095-423a-a635-46c8087c0242" containerID="e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b" exitCode=2 Apr 24 19:17:16.138423 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.138005 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764fd8f9f6-bwnwc" 
event={"ID":"186b59ca-f095-423a-a635-46c8087c0242","Type":"ContainerDied","Data":"e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b"} Apr 24 19:17:16.138423 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.138051 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764fd8f9f6-bwnwc" event={"ID":"186b59ca-f095-423a-a635-46c8087c0242","Type":"ContainerDied","Data":"303db310920d139abde9a68add2e2dd547d86e356f05cd14823552859f6d2d18"} Apr 24 19:17:16.138423 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.138066 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764fd8f9f6-bwnwc" Apr 24 19:17:16.138423 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.138075 2573 scope.go:117] "RemoveContainer" containerID="e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b" Apr 24 19:17:16.146646 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.146629 2573 scope.go:117] "RemoveContainer" containerID="e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b" Apr 24 19:17:16.146906 ip-10-0-131-214 kubenswrapper[2573]: E0424 19:17:16.146886 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b\": container with ID starting with e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b not found: ID does not exist" containerID="e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b" Apr 24 19:17:16.146971 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.146912 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b"} err="failed to get container status \"e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b\": rpc error: code = NotFound desc = could not find container 
\"e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b\": container with ID starting with e878b95d5a17a74fe1f79fb85c16e9a5fd3370479ec79d249936ed524b10a93b not found: ID does not exist" Apr 24 19:17:16.158775 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.158750 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-764fd8f9f6-bwnwc"] Apr 24 19:17:16.162430 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:16.162411 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-764fd8f9f6-bwnwc"] Apr 24 19:17:17.120267 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:17:17.120230 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="186b59ca-f095-423a-a635-46c8087c0242" path="/var/lib/kubelet/pods/186b59ca-f095-423a-a635-46c8087c0242/volumes" Apr 24 19:22:07.053086 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:22:07.053059 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:22:07.055138 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:22:07.055114 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:22:07.059039 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:22:07.059021 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:22:07.060957 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:22:07.060941 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:27:07.078225 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:27:07.078197 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:27:07.081600 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:27:07.081576 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:27:07.083782 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:27:07.083760 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:27:07.086844 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:27:07.086827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:32:07.103283 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:32:07.103252 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:32:07.106552 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:32:07.106531 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log" Apr 24 19:32:07.108533 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:32:07.108510 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:32:07.111911 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:32:07.111894 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log" Apr 24 19:37:07.133397 ip-10-0-131-214 
kubenswrapper[2573]: I0424 19:37:07.133367 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:37:07.136848 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:37:07.136823 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:37:07.138439 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:37:07.138421 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:37:07.142320 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:37:07.142302 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:42:07.157971 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:42:07.157932 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:42:07.163198 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:42:07.163176 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:42:07.163696 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:42:07.163677 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:42:07.168628 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:42:07.168597 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:47:07.182329 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:47:07.182298 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:47:07.187671 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:47:07.187650 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:47:07.188637 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:47:07.188600 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:47:07.193762 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:47:07.193744 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:52:07.210061 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:52:07.210029 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:52:07.218603 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:52:07.218581 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:52:07.219209 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:52:07.219191 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:52:07.224072 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:52:07.224049 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:57:07.238788 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:57:07.238755 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:57:07.243881 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:57:07.243853 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 19:57:07.244022 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:57:07.243967 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 19:57:07.252273 ip-10-0-131-214 kubenswrapper[2573]: I0424 19:57:07.252253 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 20:02:07.263525 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:02:07.263495 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 20:02:07.269445 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:02:07.269420 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 20:02:07.273073 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:02:07.273054 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 20:02:07.278874 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:02:07.278854 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 20:07:07.289739 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:07:07.289711 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 20:07:07.294996 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:07:07.294973 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 20:07:07.298574 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:07:07.298552 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 20:07:07.304499 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:07:07.304478 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 20:12:07.313674 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:12:07.313642 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 20:12:07.319196 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:12:07.319177 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 20:12:07.324665 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:12:07.324645 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 20:12:07.330281 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:12:07.330259 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 20:13:35.334416 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:35.334385 2573 ???:1] "http: TLS handshake error from 10.0.129.23:58684: EOF"
Apr 24 20:13:35.340968 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:35.340945 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xf9r2_6f8ef441-4de5-492e-9e4a-1c61639cde69/global-pull-secret-syncer/0.log"
Apr 24 20:13:35.531224 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:35.531195 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lgp86_bb7f420a-f722-43ae-b31f-4de4b069fe5c/konnectivity-agent/0.log"
Apr 24 20:13:35.609668 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:35.609587 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-214.ec2.internal_3f5616a56b4d82e03f666685a7c47e3f/haproxy/0.log"
Apr 24 20:13:38.930984 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:38.930954 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c4e19-0034-4eca-8468-efb2b6b708e4/alertmanager/0.log"
Apr 24 20:13:38.961087 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:38.961061 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c4e19-0034-4eca-8468-efb2b6b708e4/config-reloader/0.log"
Apr 24 20:13:38.985248 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:38.985227 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c4e19-0034-4eca-8468-efb2b6b708e4/kube-rbac-proxy-web/0.log"
Apr 24 20:13:39.015049 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.014981 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c4e19-0034-4eca-8468-efb2b6b708e4/kube-rbac-proxy/0.log"
Apr 24 20:13:39.043706 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.043683 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c4e19-0034-4eca-8468-efb2b6b708e4/kube-rbac-proxy-metric/0.log"
Apr 24 20:13:39.068168 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.068143 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c4e19-0034-4eca-8468-efb2b6b708e4/prom-label-proxy/0.log"
Apr 24 20:13:39.096691 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.096656 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c4e19-0034-4eca-8468-efb2b6b708e4/init-config-reloader/0.log"
Apr 24 20:13:39.142623 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.142576 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4lvn4_36d3fa5b-8237-4810-9120-a6a9421e039b/cluster-monitoring-operator/0.log"
Apr 24 20:13:39.247090 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.247062 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-f75f4c864-nlvk8_14a50468-3433-4a11-a3a5-017511ea1ead/metrics-server/0.log"
Apr 24 20:13:39.274035 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.273964 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-fghl5_a82aef2a-3335-40f6-8483-d1fd1479a47a/monitoring-plugin/0.log"
Apr 24 20:13:39.450992 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.450954 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v2jhm_a0a60b9f-2691-420d-8541-a3d6737868b5/node-exporter/0.log"
Apr 24 20:13:39.472272 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.472246 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v2jhm_a0a60b9f-2691-420d-8541-a3d6737868b5/kube-rbac-proxy/0.log"
Apr 24 20:13:39.494585 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.494559 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v2jhm_a0a60b9f-2691-420d-8541-a3d6737868b5/init-textfile/0.log"
Apr 24 20:13:39.525665 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.525581 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n4gvt_53275301-bce9-4425-9006-0998dc291f4f/kube-rbac-proxy-main/0.log"
Apr 24 20:13:39.548888 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.548861 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n4gvt_53275301-bce9-4425-9006-0998dc291f4f/kube-rbac-proxy-self/0.log"
Apr 24 20:13:39.576155 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.576131 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n4gvt_53275301-bce9-4425-9006-0998dc291f4f/openshift-state-metrics/0.log"
Apr 24 20:13:39.843486 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.843409 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-p2bjg_342fbf06-1cb6-4b01-abb5-6eda8b2456eb/prometheus-operator-admission-webhook/0.log"
Apr 24 20:13:39.877775 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.877748 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56bb55b79d-kdvgh_25a2f786-c145-4f81-9f1e-39b38e46a058/telemeter-client/0.log"
Apr 24 20:13:39.902514 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.902491 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56bb55b79d-kdvgh_25a2f786-c145-4f81-9f1e-39b38e46a058/reload/0.log"
Apr 24 20:13:39.925842 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.925818 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56bb55b79d-kdvgh_25a2f786-c145-4f81-9f1e-39b38e46a058/kube-rbac-proxy/0.log"
Apr 24 20:13:39.965777 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.965753 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd9f79f69-q7q9c_5c18ab70-91cb-4217-b088-7eadecdfc842/thanos-query/0.log"
Apr 24 20:13:39.998758 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:39.998737 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd9f79f69-q7q9c_5c18ab70-91cb-4217-b088-7eadecdfc842/kube-rbac-proxy-web/0.log"
Apr 24 20:13:40.033896 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:40.033868 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd9f79f69-q7q9c_5c18ab70-91cb-4217-b088-7eadecdfc842/kube-rbac-proxy/0.log"
Apr 24 20:13:40.069438 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:40.069410 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd9f79f69-q7q9c_5c18ab70-91cb-4217-b088-7eadecdfc842/prom-label-proxy/0.log"
Apr 24 20:13:40.099591 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:40.099525 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd9f79f69-q7q9c_5c18ab70-91cb-4217-b088-7eadecdfc842/kube-rbac-proxy-rules/0.log"
Apr 24 20:13:40.135774 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:40.135743 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd9f79f69-q7q9c_5c18ab70-91cb-4217-b088-7eadecdfc842/kube-rbac-proxy-metrics/0.log"
Apr 24 20:13:41.636706 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:41.636650 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/2.log"
Apr 24 20:13:41.641257 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:41.641235 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-545tt_48af1cfa-4ff5-4543-8939-d43ac71b40ad/console-operator/3.log"
Apr 24 20:13:42.004027 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.003999 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85d7f4d7c9-8mw2j_82c7a3b7-4f81-44c2-b03f-394040e706c8/console/0.log"
Apr 24 20:13:42.041331 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.041305 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-s5jkh_5ecc46dd-f655-4f16-9e14-5494a657924a/download-server/0.log"
Apr 24 20:13:42.651644 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.651594 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"]
Apr 24 20:13:42.652027 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.651969 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="186b59ca-f095-423a-a635-46c8087c0242" containerName="console"
Apr 24 20:13:42.652027 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.651980 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="186b59ca-f095-423a-a635-46c8087c0242" containerName="console"
Apr 24 20:13:42.652096 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.652064 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="186b59ca-f095-423a-a635-46c8087c0242" containerName="console"
Apr 24 20:13:42.655238 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.655222 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.657573 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.657551 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tkz2z\"/\"kube-root-ca.crt\""
Apr 24 20:13:42.657707 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.657551 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tkz2z\"/\"openshift-service-ca.crt\""
Apr 24 20:13:42.658662 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.658639 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tkz2z\"/\"default-dockercfg-k94xx\""
Apr 24 20:13:42.660404 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.660382 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"]
Apr 24 20:13:42.750971 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.750930 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-sys\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.750971 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.750976 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-proc\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.751190 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.750998 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58t85\" (UniqueName: \"kubernetes.io/projected/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-kube-api-access-58t85\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.751190 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.751102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-podres\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.751190 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.751149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-lib-modules\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.852656 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.852581 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-sys\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.852840 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.852683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-proc\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.852840 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.852702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58t85\" (UniqueName: \"kubernetes.io/projected/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-kube-api-access-58t85\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.852840 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.852714 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-sys\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.852840 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.852780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-podres\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.852840 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.852797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-proc\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.852840 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.852814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-lib-modules\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.853081 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.852888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-podres\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.853081 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.852946 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-lib-modules\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.860149 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.860117 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58t85\" (UniqueName: \"kubernetes.io/projected/19940cba-6c60-45bc-8b82-c0ea0bfc6fe0-kube-api-access-58t85\") pod \"perf-node-gather-daemonset-cgg6c\" (UID: \"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:42.966994 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:42.966952 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:43.069796 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.069766 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fgzls_c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed/dns/0.log"
Apr 24 20:13:43.097329 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.097293 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fgzls_c3fa5dcb-318b-41df-bc88-8b9ba1e9d4ed/kube-rbac-proxy/0.log"
Apr 24 20:13:43.102547 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.102522 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"]
Apr 24 20:13:43.106266 ip-10-0-131-214 kubenswrapper[2573]: W0424 20:13:43.106230 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod19940cba_6c60_45bc_8b82_c0ea0bfc6fe0.slice/crio-5504a117b34b8e244e5ea2098ce22b3393c8c486af9861a34f5cd72fb3a94970 WatchSource:0}: Error finding container 5504a117b34b8e244e5ea2098ce22b3393c8c486af9861a34f5cd72fb3a94970: Status 404 returned error can't find the container with id 5504a117b34b8e244e5ea2098ce22b3393c8c486af9861a34f5cd72fb3a94970
Apr 24 20:13:43.107830 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.107809 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 20:13:43.225140 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.225063 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bk6r2_79c1d341-2bed-41fe-b49c-3f1de4604feb/dns-node-resolver/0.log"
Apr 24 20:13:43.700359 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.700316 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c" event={"ID":"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0","Type":"ContainerStarted","Data":"de67b3b046644c8e65de6a959fecdfa1993f0ed8d08fc6c5d34ac82b256f690d"}
Apr 24 20:13:43.700359 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.700363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c" event={"ID":"19940cba-6c60-45bc-8b82-c0ea0bfc6fe0","Type":"ContainerStarted","Data":"5504a117b34b8e244e5ea2098ce22b3393c8c486af9861a34f5cd72fb3a94970"}
Apr 24 20:13:43.700843 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.700405 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:43.717716 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.717660 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c" podStartSLOduration=1.717637892 podStartE2EDuration="1.717637892s" podCreationTimestamp="2026-04-24 20:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:13:43.717058569 +0000 UTC m=+3997.187218097" watchObservedRunningTime="2026-04-24 20:13:43.717637892 +0000 UTC m=+3997.187797421"
Apr 24 20:13:43.725023 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:43.724998 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2p82x_9e31045f-47cd-4c8f-bb8b-a6a36b6bdb92/node-ca/0.log"
Apr 24 20:13:44.514144 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:44.514121 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-54854595f4-ljrqx_510e887c-a3b0-44b9-b21c-9137b720d224/router/0.log"
Apr 24 20:13:44.826952 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:44.826873 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-d8rzc_429311fb-ef10-40c4-958e-de80bbde38f2/serve-healthcheck-canary/0.log"
Apr 24 20:13:45.387542 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:45.387512 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ncvw4_0844d470-2d66-44fe-8ddb-05c8d01dc2c8/kube-rbac-proxy/0.log"
Apr 24 20:13:45.409354 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:45.409323 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ncvw4_0844d470-2d66-44fe-8ddb-05c8d01dc2c8/exporter/0.log"
Apr 24 20:13:45.430894 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:45.430863 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ncvw4_0844d470-2d66-44fe-8ddb-05c8d01dc2c8/extractor/0.log"
Apr 24 20:13:47.445816 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:47.445778 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-8cdbbc8b5-mzs94_175a576a-5ae9-4449-94e2-ce494e7056bf/manager/0.log"
Apr 24 20:13:47.533115 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:47.533088 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-dh96v_80404089-e731-4d12-82e8-02e3a6a9f8e0/manager/0.log"
Apr 24 20:13:49.714641 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:49.714601 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-cgg6c"
Apr 24 20:13:52.063150 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:52.063052 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vjxtr_d40c19b7-e7a8-4514-af70-6b73c6866411/migrator/0.log"
Apr 24 20:13:52.084885 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:52.084858 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vjxtr_d40c19b7-e7a8-4514-af70-6b73c6866411/graceful-termination/0.log"
Apr 24 20:13:53.326555 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:53.326520 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zr6t_910be432-3c6b-4796-b69f-ec249fce39e9/kube-multus/0.log"
Apr 24 20:13:53.724003 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:53.723973 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhc28_6f5ee354-6f03-4d9b-8ef6-57c8988d266c/kube-multus-additional-cni-plugins/0.log"
Apr 24 20:13:53.754203 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:53.754173 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhc28_6f5ee354-6f03-4d9b-8ef6-57c8988d266c/egress-router-binary-copy/0.log"
Apr 24 20:13:53.805324 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:53.805296 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhc28_6f5ee354-6f03-4d9b-8ef6-57c8988d266c/cni-plugins/0.log"
Apr 24 20:13:53.859189 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:53.859119 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhc28_6f5ee354-6f03-4d9b-8ef6-57c8988d266c/bond-cni-plugin/0.log"
Apr 24 20:13:53.895649 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:53.895622 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhc28_6f5ee354-6f03-4d9b-8ef6-57c8988d266c/routeoverride-cni/0.log"
Apr 24 20:13:53.916822 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:53.916797 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhc28_6f5ee354-6f03-4d9b-8ef6-57c8988d266c/whereabouts-cni-bincopy/0.log"
Apr 24 20:13:53.937848 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:53.937816 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhc28_6f5ee354-6f03-4d9b-8ef6-57c8988d266c/whereabouts-cni/0.log"
Apr 24 20:13:54.039174 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:54.039135 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cr4ls_8a81da49-19b8-407f-a961-d85a0ec045e1/network-metrics-daemon/0.log"
Apr 24 20:13:54.061469 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:54.061441 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cr4ls_8a81da49-19b8-407f-a961-d85a0ec045e1/kube-rbac-proxy/0.log"
Apr 24 20:13:55.271932 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:55.271903 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-controller/0.log"
Apr 24 20:13:55.290507 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:55.290472 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/0.log"
Apr 24 20:13:55.307600 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:55.307574 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovn-acl-logging/1.log"
Apr 24 20:13:55.327592 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:55.327572 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/kube-rbac-proxy-node/0.log"
Apr 24 20:13:55.353163 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:55.353141 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 20:13:55.374671 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:55.374652 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/northd/0.log"
Apr 24 20:13:55.396906 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:55.396882 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/nbdb/0.log"
Apr 24 20:13:55.419050 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:55.419029 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/sbdb/0.log"
Apr 24 20:13:55.515997 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:55.515970 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k6tsb_d25f07b3-9bc3-4e49-ad47-406fe9d7e1da/ovnkube-controller/0.log"
Apr 24 20:13:56.870936 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:56.870906 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-lh97m_9a5a78d0-d426-447b-ab33-a09e0d9966e1/check-endpoints/0.log"
Apr 24 20:13:56.960888 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:56.960861 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-r98mn_fd984433-e82d-4be9-964f-829123c5bb26/network-check-target-container/0.log"
Apr 24 20:13:57.877221 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:57.877190 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-w2hgp_decb2c4f-4ec7-44bc-abba-863c45b63162/iptables-alerter/0.log"
Apr 24 20:13:58.549656 ip-10-0-131-214 kubenswrapper[2573]: I0424 20:13:58.549626 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-pr4b4_73d34880-7e47-4c3a-869e-7a929e328a13/tuned/0.log"