Apr 23 08:14:37.948544 ip-10-0-134-8 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:14:38.407515 ip-10-0-134-8 kubenswrapper[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:14:38.407515 ip-10-0-134-8 kubenswrapper[2561]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:14:38.407515 ip-10-0-134-8 kubenswrapper[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:14:38.407515 ip-10-0-134-8 kubenswrapper[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:14:38.407515 ip-10-0-134-8 kubenswrapper[2561]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
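The deprecation warnings above all point at the kubelet config file named by --config. A minimal sketch of what migrating those flags into a KubeletConfiguration might look like (field names are from the kubelet.config.k8s.io/v1beta1 API; the resource values and the volume-plugin path here are illustrative, not taken from this node):

```yaml
# Sketch of /etc/kubernetes/kubelet.conf (the path passed via --config above)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path illustrative)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (values illustrative)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 100Mi
```

Flags left on the command line override the corresponding config-file fields, so a clean migration removes the flag from the unit file at the same time.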
Apr 23 08:14:38.410196 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.410106 2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:14:38.413060 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413044 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:14:38.413060 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413059 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413063 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413067 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413070 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413073 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413076 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413080 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413083 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413086 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413089 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413091 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413094 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413097 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413099 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413102 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413105 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413107 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413119 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413122 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:14:38.413129 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413124 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413126 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413129 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413132 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413135 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413138 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413140 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413143 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413148 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413151 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413154 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413157 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413159 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413162 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413165 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413168 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413171 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413173 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413176 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:14:38.413599 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413178 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413181 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413183 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413186 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413188 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413191 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413200 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413203 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413205 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413208 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413210 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413212 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413221 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413224 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413226 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413229 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413232 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413235 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413237 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413240 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:14:38.414083 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413243 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413246 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413249 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413252 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413256 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413273 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413276 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413278 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413281 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413284 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413286 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413289 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413292 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413295 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413297 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413300 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413303 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413305 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413309 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413312 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413314 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:14:38.414617 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413317 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413319 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413322 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413324 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413333 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413337 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413742 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413747 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413750 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413752 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413755 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413758 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413761 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413764 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413766 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413769 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413772 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413774 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413777 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413779 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:14:38.415132 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413782 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413784 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413787 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413790 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413792 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413794 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413797 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413799 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413802 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413804 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413807 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413809 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413812 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413814 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413817 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413819 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413826 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413831 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413835 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:14:38.415627 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413838 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413840 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413843 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413846 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413848 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413851 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413853 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413856 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413858 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413861 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413864 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413866 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413868 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413871 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413873 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413876 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413879 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413881 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413884 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413886 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:38.416089 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413889 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413891 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413894 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413896 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413898 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413901 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413904 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413907 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413911 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413914 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413918 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413921 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413924 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413927 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413930 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413932 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413935 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413938 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413940 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413943 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:14:38.416586 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413946 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413949 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413951 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413954 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413956 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413959 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413961 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413964 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413967 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413969 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413972 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413974 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.413977 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414706 2561 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414720 2561 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414727 2561 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414732 2561 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414736 2561 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414740 2561 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414745 2561 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414749 2561 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:14:38.417092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414752 2561 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414756 2561 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414760 2561 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414764 2561 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414767 2561 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414770 2561 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414773 2561 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414775 2561 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414778 2561 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414781 2561 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414784 2561 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414788 2561 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414791 2561 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414794 2561 flags.go:64] FLAG: --config-dir=""
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414797 2561 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414800 2561 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414804 2561 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414807 2561 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414810 2561 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414813 2561 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414815 2561 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414818 2561 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414821 2561 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414825 2561 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414827 2561 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:14:38.417622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414832 2561 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414834 2561 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414837 2561 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414840 2561 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414843 2561 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414846 2561 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414850 2561 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414853 2561 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414856 2561 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414860 2561 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414866 2561 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414870 2561 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414873 2561 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414876 2561 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414879 2561 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414882 2561 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414886 2561 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414889 2561 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414891 2561 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414894 2561 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414897 2561 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414900 2561 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414904 2561 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414908 2561 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414910 2561 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 08:14:38.418219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414914 2561 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414917 2561 flags.go:64] FLAG: --healthz-port="10248" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414920 2561 flags.go:64] FLAG: --help="false" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414922 2561 flags.go:64] FLAG: --hostname-override="ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414925 2561 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414928 2561 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414931 2561 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414934 2561 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414937 2561 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414940 2561 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414943 2561 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414946 2561 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414949 2561 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414951 2561 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: 
I0423 08:14:38.414955 2561 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414958 2561 flags.go:64] FLAG: --kube-reserved="" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414962 2561 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414965 2561 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414975 2561 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414978 2561 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414981 2561 flags.go:64] FLAG: --lock-file="" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414984 2561 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414987 2561 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414990 2561 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:14:38.418834 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414995 2561 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.414998 2561 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415001 2561 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415004 2561 flags.go:64] FLAG: --logging-format="text" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415006 2561 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 08:14:38.419464 ip-10-0-134-8 
kubenswrapper[2561]: I0423 08:14:38.415010 2561 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415013 2561 flags.go:64] FLAG: --manifest-url="" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415016 2561 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415020 2561 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415023 2561 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415027 2561 flags.go:64] FLAG: --max-pods="110" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415030 2561 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415033 2561 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415036 2561 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415039 2561 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415042 2561 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415044 2561 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415047 2561 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415055 2561 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415058 2561 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415061 2561 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415064 2561 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415067 2561 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:14:38.419464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415073 2561 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415076 2561 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415079 2561 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415082 2561 flags.go:64] FLAG: --port="10250" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415086 2561 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415089 2561 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0dcb22b0a2d2dec54" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415092 2561 flags.go:64] FLAG: --qos-reserved="" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415095 2561 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415098 2561 flags.go:64] FLAG: --register-node="true" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415101 2561 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415104 2561 flags.go:64] FLAG: --register-with-taints="" Apr 23 08:14:38.420048 
ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415108 2561 flags.go:64] FLAG: --registry-burst="10" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415110 2561 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415113 2561 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415116 2561 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415120 2561 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415123 2561 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415126 2561 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415128 2561 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415131 2561 flags.go:64] FLAG: --runonce="false" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415134 2561 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415137 2561 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415141 2561 flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415144 2561 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415147 2561 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:14:38.420048 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415150 2561 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:14:38.420048 
ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415153 2561 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415156 2561 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415158 2561 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415161 2561 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415164 2561 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415167 2561 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415171 2561 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415174 2561 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415177 2561 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415182 2561 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415185 2561 flags.go:64] FLAG: --tls-cert-file="" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415188 2561 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415193 2561 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415196 2561 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415198 2561 flags.go:64] FLAG: 
--topology-manager-policy="none" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415201 2561 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415204 2561 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415207 2561 flags.go:64] FLAG: --v="2" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415211 2561 flags.go:64] FLAG: --version="false" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415215 2561 flags.go:64] FLAG: --vmodule="" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415220 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.415227 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415333 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415338 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:14:38.420685 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415341 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415344 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415347 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415350 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415353 2561 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415356 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415359 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415362 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415365 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415367 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415370 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415373 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415375 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415377 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415380 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415383 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415386 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:14:38.421312 ip-10-0-134-8 
kubenswrapper[2561]: W0423 08:14:38.415388 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415391 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415394 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:14:38.421312 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415399 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415402 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415405 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415407 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415410 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415413 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415415 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415418 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415420 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415424 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal 
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415427 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415430 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415432 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415435 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415438 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415443 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415445 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415448 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415451 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415453 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:38.421819 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415456 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415458 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415461 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415463 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415467 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415470 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415472 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415475 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415478 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415480 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415483 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415485 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415488 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415497 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415499 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415502 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415504 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415507 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415509 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415511 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:14:38.422338 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415515 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415520 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415523 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415525 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415528 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415530 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415533 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415535 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415538 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415540 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415543 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415545 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415548 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415550 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415553 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415555 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415559 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415562 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415564 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415567 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:14:38.423113 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415569 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:14:38.423643 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415571 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:14:38.423643 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415574 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:14:38.423643 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.415578 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:14:38.423643 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.416320 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:14:38.424526 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.424503 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 08:14:38.424565 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.424528 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 08:14:38.424598 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424576 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:14:38.424598 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424582 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:14:38.424598 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424585 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:38.424598 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424589 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:38.424598 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424592 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:38.424598 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424595 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:38.424598 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424598 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:38.424598 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424600 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424604 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424608 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424613 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424616 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424619 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424622 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424625 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424628 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424630 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424633 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424636 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424639 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424642 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424645 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424647 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 
08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424650 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424652 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424655 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424658 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:14:38.424794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424660 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424663 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424665 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424668 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424671 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424673 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424676 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424678 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424681 2561 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNS Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424683 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424685 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424688 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424691 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424693 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424697 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424699 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424703 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424705 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424708 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424711 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:14:38.425321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424713 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424717 2561 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424721 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424724 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424726 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424729 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424732 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424734 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424737 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424740 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424742 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424745 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424747 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424750 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: 
W0423 08:14:38.424752 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424755 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424758 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424761 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424763 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424766 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:14:38.425807 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424768 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424771 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424773 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424776 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424778 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424781 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424784 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 
08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424787 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424789 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424792 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424795 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424798 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424801 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424804 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424807 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424809 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424812 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424814 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:14:38.426334 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424817 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: 
I0423 08:14:38.424822 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424912 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424917 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424920 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424923 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424926 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424928 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424931 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424934 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424936 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424938 
2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424941 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424943 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424946 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424948 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:14:38.426769 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424951 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424953 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424956 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424959 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424962 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424964 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424967 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424970 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424972 
2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424975 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424978 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424981 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424983 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424985 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424988 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424990 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424993 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424995 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.424998 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425000 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425003 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:14:38.427160 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425006 2561 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425008 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425010 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425013 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425015 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425018 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425046 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425051 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425053 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425056 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425059 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425063 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425067 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425070 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425073 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425076 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425079 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425081 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425084 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425087 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:14:38.427681 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425091 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425093 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425097 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425100 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425103 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425106 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425109 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425111 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425114 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425116 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425119 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425121 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425124 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425126 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425129 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:14:38.428166 ip-10-0-134-8 
kubenswrapper[2561]: W0423 08:14:38.425132 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425134 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425137 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425139 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:14:38.428166 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425142 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425145 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425147 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425150 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425152 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425155 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425157 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425160 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425162 2561 feature_gate.go:328] unrecognized 
feature gate: AdminNetworkPolicy Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425165 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425168 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:38.425170 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.425175 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:14:38.428657 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.425934 2561 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 08:14:38.430467 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.430454 2561 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 08:14:38.431474 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.431462 2561 server.go:1019] "Starting client certificate rotation" Apr 23 08:14:38.431576 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.431558 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 08:14:38.431632 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.431610 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 
08:14:38.456980 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.456960 2561 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 08:14:38.459805 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.459783 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 08:14:38.477476 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.477461 2561 log.go:25] "Validated CRI v1 runtime API" Apr 23 08:14:38.484584 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.484533 2561 log.go:25] "Validated CRI v1 image API" Apr 23 08:14:38.485974 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.485959 2561 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 08:14:38.490059 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.490037 2561 fs.go:135] Filesystem UUIDs: map[3e4d76f1-b2b2-4b7b-8f28-016983c4d28d:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c161ee2c-6442-4bb2-b94c-bf74d9319e28:/dev/nvme0n1p3] Apr 23 08:14:38.490127 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.490058 2561 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 08:14:38.492036 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.492019 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 08:14:38.495502 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.495395 2561 
manager.go:217] Machine: {Timestamp:2026-04-23 08:14:38.493723825 +0000 UTC m=+0.425086617 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100439 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25f8ac53db5909935eb90ada60b767 SystemUUID:ec25f8ac-53db-5909-935e-b90ada60b767 BootID:29b33cfe-e6ba-4365-bb0e-e5cd74dfa1ec Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6a:bd:0c:73:c1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6a:bd:0c:73:c1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:39:ab:73:5e:13 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] 
Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 08:14:38.495502 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.495496 2561 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 23 08:14:38.495622 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.495571 2561 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 08:14:38.497458 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.497433 2561 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 08:14:38.497613 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.497460 2561 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-134-8.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 08:14:38.497687 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.497632 2561 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 08:14:38.497687 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.497652 2561 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 08:14:38.497687 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.497672 2561 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 08:14:38.498448 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.498435 2561 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 08:14:38.499298 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.499286 2561 state_mem.go:36] "Initialized new in-memory state store" Apr 23 08:14:38.499427 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.499416 2561 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 08:14:38.501822 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.501811 2561 kubelet.go:491] "Attempting to sync node with API server" Apr 23 08:14:38.501886 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.501828 2561 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 08:14:38.501886 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.501845 2561 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 08:14:38.501886 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.501859 2561 kubelet.go:397] "Adding apiserver pod source" Apr 23 08:14:38.501886 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.501871 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 08:14:38.503274 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.503251 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:14:38.503334 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.503290 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 08:14:38.506365 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.506347 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 08:14:38.507729 ip-10-0-134-8 kubenswrapper[2561]: I0423 
08:14:38.507715 2561 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 08:14:38.509546 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509533 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 08:14:38.509590 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509555 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 08:14:38.509590 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509565 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 08:14:38.509590 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509574 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 08:14:38.509673 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509593 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 08:14:38.509673 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509601 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 08:14:38.509673 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509608 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 08:14:38.509673 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509614 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 08:14:38.509673 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509621 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 08:14:38.509673 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509628 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 08:14:38.509673 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.509645 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 08:14:38.509673 ip-10-0-134-8 kubenswrapper[2561]: 
I0423 08:14:38.509655 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 08:14:38.510561 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.510550 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 08:14:38.510619 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.510564 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 08:14:38.511387 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.511365 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pzpqz" Apr 23 08:14:38.512409 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.512385 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 08:14:38.512467 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.512448 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-8.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 08:14:38.514671 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.514657 2561 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-8.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 08:14:38.514897 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.514888 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 08:14:38.514933 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.514923 2561 server.go:1295] 
"Started kubelet" Apr 23 08:14:38.515019 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.514997 2561 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 08:14:38.515096 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.515061 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 08:14:38.515146 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.515116 2561 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 08:14:38.515781 ip-10-0-134-8 systemd[1]: Started Kubernetes Kubelet. Apr 23 08:14:38.516615 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.516598 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 08:14:38.517847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.517832 2561 server.go:317] "Adding debug handlers to kubelet server" Apr 23 08:14:38.521194 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.521167 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pzpqz" Apr 23 08:14:38.521572 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.521553 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 08:14:38.522945 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.522927 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 08:14:38.524053 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.523973 2561 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 08:14:38.524341 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.524145 2561 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 08:14:38.524562 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.524546 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 08:14:38.524643 ip-10-0-134-8 kubenswrapper[2561]: I0423 
08:14:38.524623 2561 reconstruct.go:97] "Volume reconstruction finished" Apr 23 08:14:38.524643 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.524632 2561 reconciler.go:26] "Reconciler: start to sync state" Apr 23 08:14:38.524798 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.523440 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-8.ec2.internal.18a8ee4f91e0834f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-8.ec2.internal,UID:ip-10-0-134-8.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-8.ec2.internal,},FirstTimestamp:2026-04-23 08:14:38.514897743 +0000 UTC m=+0.446260535,LastTimestamp:2026-04-23 08:14:38.514897743 +0000 UTC m=+0.446260535,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-8.ec2.internal,}" Apr 23 08:14:38.525087 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.525048 2561 factory.go:55] Registering systemd factory Apr 23 08:14:38.525087 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.525079 2561 factory.go:223] Registration of the systemd container factory successfully Apr 23 08:14:38.525308 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.525252 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found" Apr 23 08:14:38.525405 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.525389 2561 factory.go:153] Registering CRI-O factory Apr 23 08:14:38.525405 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.525407 2561 factory.go:223] Registration of the crio container factory successfully Apr 23 08:14:38.525537 
ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.525459 2561 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 08:14:38.525537 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.525490 2561 factory.go:103] Registering Raw factory Apr 23 08:14:38.525537 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.525506 2561 manager.go:1196] Started watching for new ooms in manager Apr 23 08:14:38.526220 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.525940 2561 manager.go:319] Starting recovery of all containers Apr 23 08:14:38.528500 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.528477 2561 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 08:14:38.533313 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.533142 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:14:38.536282 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.536247 2561 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-8.ec2.internal\" not found" node="ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.537089 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.537076 2561 manager.go:324] Recovery completed Apr 23 08:14:38.541062 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.541050 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:38.543337 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.543322 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:38.543404 ip-10-0-134-8 kubenswrapper[2561]: I0423 
08:14:38.543349 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:38.543404 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.543360 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:38.543803 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.543789 2561 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 08:14:38.543803 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.543799 2561 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 08:14:38.543888 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.543815 2561 state_mem.go:36] "Initialized new in-memory state store" Apr 23 08:14:38.546007 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.545990 2561 policy_none.go:49] "None policy: Start" Apr 23 08:14:38.546077 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.546015 2561 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 08:14:38.546077 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.546029 2561 state_mem.go:35] "Initializing new in-memory state store" Apr 23 08:14:38.578861 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.578844 2561 manager.go:341] "Starting Device Plugin manager" Apr 23 08:14:38.592579 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.578870 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 08:14:38.592579 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.578879 2561 server.go:85] "Starting device plugin registration server" Apr 23 08:14:38.592579 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.579111 2561 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 08:14:38.592579 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.579123 2561 container_log_manager.go:189] "Initializing container log rotate workers" 
workers=1 monitorPeriod="10s" Apr 23 08:14:38.592579 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.579209 2561 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 08:14:38.592579 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.579313 2561 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 08:14:38.592579 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.579321 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 08:14:38.592579 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.580833 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 08:14:38.592579 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.580870 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-8.ec2.internal\" not found" Apr 23 08:14:38.658411 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.658323 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 08:14:38.659573 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.659554 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 08:14:38.659632 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.659590 2561 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 08:14:38.659632 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.659612 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 08:14:38.659632 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.659621 2561 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 08:14:38.659730 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.659662 2561 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 08:14:38.663874 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.663856 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:14:38.680198 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.680171 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:38.680922 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.680908 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:38.681004 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.680934 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:38.681004 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.680945 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:38.681004 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.680968 2561 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.689420 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.689404 2561 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.689504 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.689428 2561 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-8.ec2.internal\": node \"ip-10-0-134-8.ec2.internal\" not found" Apr 23 08:14:38.712226 ip-10-0-134-8 
kubenswrapper[2561]: E0423 08:14:38.712205 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found" Apr 23 08:14:38.760344 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.760318 2561 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal"] Apr 23 08:14:38.760413 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.760384 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:38.761939 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.761924 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:38.762001 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.761951 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:38.762001 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.761961 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:38.763145 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.763134 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:38.763378 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.763358 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.763417 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.763394 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:38.764317 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.764301 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:38.764409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.764327 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:38.764409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.764350 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:38.764409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.764364 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:38.764409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.764330 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:38.764594 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.764435 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:38.765672 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.765657 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.765748 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.765689 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:14:38.766419 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.766402 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:14:38.766509 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.766423 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:14:38.766509 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.766432 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:14:38.788813 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.788788 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-8.ec2.internal\" not found" node="ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.793008 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.792992 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-8.ec2.internal\" not found" node="ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.812532 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.812517 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found" Apr 23 08:14:38.826352 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.826333 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4a0c7941d424e31aadfd07308d6e5c7b-config\") pod \"kube-apiserver-proxy-ip-10-0-134-8.ec2.internal\" (UID: 
\"4a0c7941d424e31aadfd07308d6e5c7b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.826417 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.826358 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/327ce7e2a793cf29ed8c9455dfd4f163-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal\" (UID: \"327ce7e2a793cf29ed8c9455dfd4f163\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.826417 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.826381 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/327ce7e2a793cf29ed8c9455dfd4f163-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal\" (UID: \"327ce7e2a793cf29ed8c9455dfd4f163\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.913012 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:38.912932 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found" Apr 23 08:14:38.926918 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.926896 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4a0c7941d424e31aadfd07308d6e5c7b-config\") pod \"kube-apiserver-proxy-ip-10-0-134-8.ec2.internal\" (UID: \"4a0c7941d424e31aadfd07308d6e5c7b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.927023 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.926923 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/327ce7e2a793cf29ed8c9455dfd4f163-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal\" 
(UID: \"327ce7e2a793cf29ed8c9455dfd4f163\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.927023 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.926939 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/327ce7e2a793cf29ed8c9455dfd4f163-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal\" (UID: \"327ce7e2a793cf29ed8c9455dfd4f163\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.927023 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.926977 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/327ce7e2a793cf29ed8c9455dfd4f163-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal\" (UID: \"327ce7e2a793cf29ed8c9455dfd4f163\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.927023 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.926995 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4a0c7941d424e31aadfd07308d6e5c7b-config\") pod \"kube-apiserver-proxy-ip-10-0-134-8.ec2.internal\" (UID: \"4a0c7941d424e31aadfd07308d6e5c7b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal" Apr 23 08:14:38.927023 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:38.927005 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/327ce7e2a793cf29ed8c9455dfd4f163-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal\" (UID: \"327ce7e2a793cf29ed8c9455dfd4f163\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" Apr 23 08:14:39.013324 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.013293 2561 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.090844 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.090818 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal"
Apr 23 08:14:39.095331 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.095315 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal"
Apr 23 08:14:39.113932 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.113908 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.214508 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.214476 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.314998 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.314965 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.415517 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.415489 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.430942 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.430927 2561 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 08:14:39.431062 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.431045 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:14:39.431119 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.431087 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:14:39.515803 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.515739 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.520649 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.520626 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:14:39.523522 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.523502 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 08:14:39.524523 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.524498 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:09:38 +0000 UTC" deadline="2027-10-02 12:30:35.161860416 +0000 UTC"
Apr 23 08:14:39.524621 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.524522 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12652h15m55.637340477s"
Apr 23 08:14:39.539372 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.539345 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:14:39.616708 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.616677 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.622468 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.622449 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xm8rm"
Apr 23 08:14:39.630913 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.630894 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xm8rm"
Apr 23 08:14:39.651087 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:39.651038 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a0c7941d424e31aadfd07308d6e5c7b.slice/crio-088d92381c78e57aa3a93fdc08eeda2600a2ae8dc82656825c96e1d10a9ce324 WatchSource:0}: Error finding container 088d92381c78e57aa3a93fdc08eeda2600a2ae8dc82656825c96e1d10a9ce324: Status 404 returned error can't find the container with id 088d92381c78e57aa3a93fdc08eeda2600a2ae8dc82656825c96e1d10a9ce324
Apr 23 08:14:39.651476 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:39.651457 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod327ce7e2a793cf29ed8c9455dfd4f163.slice/crio-2fc48ca816580f42aac5ac220586e2d9cff8b5d39ef1e73e4f698b2e8611e190 WatchSource:0}: Error finding container 2fc48ca816580f42aac5ac220586e2d9cff8b5d39ef1e73e4f698b2e8611e190: Status 404 returned error can't find the container with id 2fc48ca816580f42aac5ac220586e2d9cff8b5d39ef1e73e4f698b2e8611e190
Apr 23 08:14:39.655257 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.655245 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:14:39.662106 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.662066 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal" event={"ID":"4a0c7941d424e31aadfd07308d6e5c7b","Type":"ContainerStarted","Data":"088d92381c78e57aa3a93fdc08eeda2600a2ae8dc82656825c96e1d10a9ce324"}
Apr 23 08:14:39.663333 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.663315 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" event={"ID":"327ce7e2a793cf29ed8c9455dfd4f163","Type":"ContainerStarted","Data":"2fc48ca816580f42aac5ac220586e2d9cff8b5d39ef1e73e4f698b2e8611e190"}
Apr 23 08:14:39.717299 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.717271 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.817839 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.817761 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.918337 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:39.918308 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-8.ec2.internal\" not found"
Apr 23 08:14:39.971386 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:39.971361 2561 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:14:40.023855 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.023831 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal"
Apr 23 08:14:40.031568 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.031548 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:14:40.032443 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.032432 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal"
Apr 23 08:14:40.041992 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.041976 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:14:40.503317 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.503288 2561 apiserver.go:52] "Watching apiserver"
Apr 23 08:14:40.512971 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.512948 2561 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 08:14:40.514305 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.514276 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-zngnf","openshift-dns/node-resolver-vdfxl","openshift-image-registry/node-ca-86wvz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal","openshift-network-diagnostics/network-check-target-mbfqt","openshift-network-operator/iptables-alerter-l76bb","openshift-ovn-kubernetes/ovnkube-node-v5wkc","kube-system/konnectivity-agent-mht6n","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts","openshift-multus/multus-additional-cni-plugins-hg44l","openshift-multus/multus-jgxcr","openshift-multus/network-metrics-daemon-pmv55","kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal"]
Apr 23 08:14:40.517187 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.517162 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.520254 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.520202 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 08:14:40.520536 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.520520 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 08:14:40.520620 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.520554 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 08:14:40.520827 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.520812 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 08:14:40.521062 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.520998 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-swx5j\""
Apr 23 08:14:40.521062 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.521052 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 08:14:40.522172 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.522152 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-86wvz"
Apr 23 08:14:40.524561 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.524541 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vdfxl"
Apr 23 08:14:40.527067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.526706 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 08:14:40.527156 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.527134 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gk9j6\""
Apr 23 08:14:40.527532 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.527254 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 08:14:40.528111 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.528090 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 08:14:40.529401 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.528590 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:14:40.529401 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.528896 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k6dm6\""
Apr 23 08:14:40.529401 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:40.529071 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe"
Apr 23 08:14:40.529401 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.529176 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 08:14:40.529740 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.529723 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 08:14:40.531110 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.531093 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l76bb"
Apr 23 08:14:40.531602 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.531585 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.533527 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.533512 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mht6n"
Apr 23 08:14:40.533749 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.533726 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:14:40.533824 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.533795 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zn7rr\""
Apr 23 08:14:40.533877 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.533841 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 08:14:40.534386 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.533728 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 08:14:40.534386 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.534053 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 08:14:40.534386 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.534326 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 08:14:40.534386 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.534328 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 08:14:40.534791 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.534768 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 08:14:40.534940 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.534925 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t9gsk\""
Apr 23 08:14:40.536023 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536003 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 08:14:40.536117 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536082 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6rgd8\""
Apr 23 08:14:40.536174 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536154 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-cnibin\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.536227 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536194 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a8488f0-d2d8-4107-b542-5f46729c4927-cni-binary-copy\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.536293 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536224 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmtm\" (UniqueName: \"kubernetes.io/projected/0a8488f0-d2d8-4107-b542-5f46729c4927-kube-api-access-brmtm\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.536293 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536237 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.536293 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536249 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f5d8347-124b-469f-8ac6-0c963d6c4634-host\") pod \"node-ca-86wvz\" (UID: \"3f5d8347-124b-469f-8ac6-0c963d6c4634\") " pod="openshift-image-registry/node-ca-86wvz"
Apr 23 08:14:40.536449 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536333 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntvmn\" (UniqueName: \"kubernetes.io/projected/3f5d8347-124b-469f-8ac6-0c963d6c4634-kube-api-access-ntvmn\") pod \"node-ca-86wvz\" (UID: \"3f5d8347-124b-469f-8ac6-0c963d6c4634\") " pod="openshift-image-registry/node-ca-86wvz"
Apr 23 08:14:40.536449 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536373 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02a74ef7-7607-44c8-9c82-00f2c73ba0e8-host-slash\") pod \"iptables-alerter-l76bb\" (UID: \"02a74ef7-7607-44c8-9c82-00f2c73ba0e8\") " pod="openshift-network-operator/iptables-alerter-l76bb"
Apr 23 08:14:40.536449 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536421 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a8488f0-d2d8-4107-b542-5f46729c4927-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.536591 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536448 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr64q\" (UniqueName: \"kubernetes.io/projected/556cc9f0-a576-455e-b539-83577cba025c-kube-api-access-cr64q\") pod \"node-resolver-vdfxl\" (UID: \"556cc9f0-a576-455e-b539-83577cba025c\") " pod="openshift-dns/node-resolver-vdfxl"
Apr 23 08:14:40.536591 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536474 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6dqp\" (UniqueName: \"kubernetes.io/projected/02a74ef7-7607-44c8-9c82-00f2c73ba0e8-kube-api-access-n6dqp\") pod \"iptables-alerter-l76bb\" (UID: \"02a74ef7-7607-44c8-9c82-00f2c73ba0e8\") " pod="openshift-network-operator/iptables-alerter-l76bb"
Apr 23 08:14:40.536591 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536535 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.536735 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536590 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3f5d8347-124b-469f-8ac6-0c963d6c4634-serviceca\") pod \"node-ca-86wvz\" (UID: \"3f5d8347-124b-469f-8ac6-0c963d6c4634\") " pod="openshift-image-registry/node-ca-86wvz"
Apr 23 08:14:40.536735 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536616 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkltj\" (UniqueName: \"kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj\") pod \"network-check-target-mbfqt\" (UID: \"59f9a0a5-064a-4dd4-9790-0bff108c8fbe\") " pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:14:40.536735 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536643 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-system-cni-dir\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.536735 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536687 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-os-release\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.536735 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536711 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a8488f0-d2d8-4107-b542-5f46729c4927-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.536735 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536732 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/556cc9f0-a576-455e-b539-83577cba025c-hosts-file\") pod \"node-resolver-vdfxl\" (UID: \"556cc9f0-a576-455e-b539-83577cba025c\") " pod="openshift-dns/node-resolver-vdfxl"
Apr 23 08:14:40.537022 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536753 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/556cc9f0-a576-455e-b539-83577cba025c-tmp-dir\") pod \"node-resolver-vdfxl\" (UID: \"556cc9f0-a576-455e-b539-83577cba025c\") " pod="openshift-dns/node-resolver-vdfxl"
Apr 23 08:14:40.537022 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536776 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02a74ef7-7607-44c8-9c82-00f2c73ba0e8-iptables-alerter-script\") pod \"iptables-alerter-l76bb\" (UID: \"02a74ef7-7607-44c8-9c82-00f2c73ba0e8\") " pod="openshift-network-operator/iptables-alerter-l76bb"
Apr 23 08:14:40.537022 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.536916 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 08:14:40.537172 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.537101 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 08:14:40.537275 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.537239 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 08:14:40.538673 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.538653 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-m8kzb\""
Apr 23 08:14:40.538838 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.538823 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 08:14:40.538920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.538852 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 08:14:40.538999 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.538970 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 08:14:40.539209 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.539188 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.541216 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.541189 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.541619 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.541543 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:14:40.541712 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.541692 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z2mkl\""
Apr 23 08:14:40.541758 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.541712 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 08:14:40.543360 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.543333 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 08:14:40.543657 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.543641 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-czlf9\""
Apr 23 08:14:40.543969 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.543949 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:14:40.544055 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:40.544024 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891"
Apr 23 08:14:40.564815 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.564726 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:14:40.625835 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.625804 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 08:14:40.631543 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.631510 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:09:39 +0000 UTC" deadline="2027-11-15 10:39:32.453158616 +0000 UTC"
Apr 23 08:14:40.631543 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.631536 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13706h24m51.821626446s"
Apr 23 08:14:40.637081 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637056 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-run-ovn-kubernetes\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.637182 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637087 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-cni-bin\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.637182 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637114 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b438d86-63e2-4e66-a166-475de69c7900-agent-certs\") pod \"konnectivity-agent-mht6n\" (UID: \"6b438d86-63e2-4e66-a166-475de69c7900\") " pod="kube-system/konnectivity-agent-mht6n"
Apr 23 08:14:40.637182 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637138 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-run-multus-certs\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.637340 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637188 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05731c48-9bfe-46ed-8390-b6d811272383-env-overrides\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.637340 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637212 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05731c48-9bfe-46ed-8390-b6d811272383-ovn-node-metrics-cert\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.637340 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637272 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a8488f0-d2d8-4107-b542-5f46729c4927-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.637340 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637319 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/556cc9f0-a576-455e-b539-83577cba025c-tmp-dir\") pod \"node-resolver-vdfxl\" (UID: \"556cc9f0-a576-455e-b539-83577cba025c\") " pod="openshift-dns/node-resolver-vdfxl"
Apr 23 08:14:40.637520 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637404 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-run-netns\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.637520 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637456 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-var-lib-openvswitch\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.637520 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637488 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-sysconfig\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.637662 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637521 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-tuned\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.637662 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637546 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-run-k8s-cni-cncf-io\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.637662 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637575 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-cnibin\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.637662 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637602 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntvmn\" (UniqueName: \"kubernetes.io/projected/3f5d8347-124b-469f-8ac6-0c963d6c4634-kube-api-access-ntvmn\") pod \"node-ca-86wvz\" (UID: \"3f5d8347-124b-469f-8ac6-0c963d6c4634\") " pod="openshift-image-registry/node-ca-86wvz"
Apr 23 08:14:40.637662 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637628 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-sys-fs\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.637662 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637642 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/556cc9f0-a576-455e-b539-83577cba025c-tmp-dir\") pod \"node-resolver-vdfxl\" (UID: \"556cc9f0-a576-455e-b539-83577cba025c\") " pod="openshift-dns/node-resolver-vdfxl"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637678 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-cnibin\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637719 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/892bfeb4-76ad-49cf-b615-dfa772b87a7e-cni-binary-copy\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637743 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-run-netns\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637760 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-var-lib-cni-multus\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637793 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-hostroot\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637821 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkc2\" (UniqueName: \"kubernetes.io/projected/e92a791e-42ac-4855-b7b5-945f53108891-kube-api-access-hwkc2\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637843 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-run-openvswitch\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637865 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-socket-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637880 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-host\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637896 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sssw\" (UniqueName: \"kubernetes.io/projected/2b615e73-dc52-4885-94d7-dc4fecd877f6-kube-api-access-7sssw\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637903 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a8488f0-d2d8-4107-b542-5f46729c4927-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.637920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637919 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-etc-openvswitch\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637944 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x75hp\" (UniqueName: \"kubernetes.io/projected/05731c48-9bfe-46ed-8390-b6d811272383-kube-api-access-x75hp\") pod \"ovnkube-node-v5wkc\" (UID:
\"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.637965 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-device-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638007 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-sysctl-d\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638043 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-os-release\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638068 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/556cc9f0-a576-455e-b539-83577cba025c-hosts-file\") pod \"node-resolver-vdfxl\" (UID: \"556cc9f0-a576-455e-b539-83577cba025c\") " pod="openshift-dns/node-resolver-vdfxl" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638102 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/05731c48-9bfe-46ed-8390-b6d811272383-ovnkube-script-lib\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638125 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-registration-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638147 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-sys\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638140 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-os-release\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638168 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-lib-modules\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638208 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-system-cni-dir\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638223 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/556cc9f0-a576-455e-b539-83577cba025c-hosts-file\") pod \"node-resolver-vdfxl\" (UID: \"556cc9f0-a576-455e-b539-83577cba025c\") " pod="openshift-dns/node-resolver-vdfxl" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638280 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-daemon-config\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638306 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02a74ef7-7607-44c8-9c82-00f2c73ba0e8-host-slash\") pod \"iptables-alerter-l76bb\" (UID: \"02a74ef7-7607-44c8-9c82-00f2c73ba0e8\") " pod="openshift-network-operator/iptables-alerter-l76bb" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638329 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-node-log\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638350 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-etc-selinux\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.638409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638374 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b615e73-dc52-4885-94d7-dc4fecd877f6-tmp\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638389 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02a74ef7-7607-44c8-9c82-00f2c73ba0e8-host-slash\") pod \"iptables-alerter-l76bb\" (UID: \"02a74ef7-7607-44c8-9c82-00f2c73ba0e8\") " pod="openshift-network-operator/iptables-alerter-l76bb" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638397 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-conf-dir\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638432 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-etc-kubernetes\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 
08:14:40.638449 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-run\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638464 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-socket-dir-parent\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638487 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3f5d8347-124b-469f-8ac6-0c963d6c4634-serviceca\") pod \"node-ca-86wvz\" (UID: \"3f5d8347-124b-469f-8ac6-0c963d6c4634\") " pod="openshift-image-registry/node-ca-86wvz" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638534 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkltj\" (UniqueName: \"kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj\") pod \"network-check-target-mbfqt\" (UID: \"59f9a0a5-064a-4dd4-9790-0bff108c8fbe\") " pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638589 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05731c48-9bfe-46ed-8390-b6d811272383-ovnkube-config\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 
08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638607 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-var-lib-cni-bin\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638628 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02a74ef7-7607-44c8-9c82-00f2c73ba0e8-iptables-alerter-script\") pod \"iptables-alerter-l76bb\" (UID: \"02a74ef7-7607-44c8-9c82-00f2c73ba0e8\") " pod="openshift-network-operator/iptables-alerter-l76bb" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638665 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-kubernetes\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638830 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-systemd\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638840 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3f5d8347-124b-469f-8ac6-0c963d6c4634-serviceca\") pod \"node-ca-86wvz\" (UID: \"3f5d8347-124b-469f-8ac6-0c963d6c4634\") " 
pod="openshift-image-registry/node-ca-86wvz" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638867 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a8488f0-d2d8-4107-b542-5f46729c4927-cni-binary-copy\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638888 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f5d8347-124b-469f-8ac6-0c963d6c4634-host\") pod \"node-ca-86wvz\" (UID: \"3f5d8347-124b-469f-8ac6-0c963d6c4634\") " pod="openshift-image-registry/node-ca-86wvz" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638912 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-slash\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.639067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.638952 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f5d8347-124b-469f-8ac6-0c963d6c4634-host\") pod \"node-ca-86wvz\" (UID: \"3f5d8347-124b-469f-8ac6-0c963d6c4634\") " pod="openshift-image-registry/node-ca-86wvz" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639001 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b438d86-63e2-4e66-a166-475de69c7900-konnectivity-ca\") pod \"konnectivity-agent-mht6n\" (UID: \"6b438d86-63e2-4e66-a166-475de69c7900\") " 
pod="kube-system/konnectivity-agent-mht6n" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639028 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-modprobe-d\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639052 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-var-lib-kubelet\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639076 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-var-lib-kubelet\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639104 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a8488f0-d2d8-4107-b542-5f46729c4927-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639130 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6dqp\" (UniqueName: 
\"kubernetes.io/projected/02a74ef7-7607-44c8-9c82-00f2c73ba0e8-kube-api-access-n6dqp\") pod \"iptables-alerter-l76bb\" (UID: \"02a74ef7-7607-44c8-9c82-00f2c73ba0e8\") " pod="openshift-network-operator/iptables-alerter-l76bb" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639157 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-systemd-units\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639186 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-run-systemd\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639210 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639237 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 
08:14:40.639282 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66bz6\" (UniqueName: \"kubernetes.io/projected/892bfeb4-76ad-49cf-b615-dfa772b87a7e-kube-api-access-66bz6\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639292 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02a74ef7-7607-44c8-9c82-00f2c73ba0e8-iptables-alerter-script\") pod \"iptables-alerter-l76bb\" (UID: \"02a74ef7-7607-44c8-9c82-00f2c73ba0e8\") " pod="openshift-network-operator/iptables-alerter-l76bb" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639336 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-system-cni-dir\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639368 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-run-ovn\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639395 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-cni-dir\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " 
pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.639847 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639400 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a8488f0-d2d8-4107-b542-5f46729c4927-cni-binary-copy\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639419 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-cnibin\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639439 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639445 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639451 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a8488f0-d2d8-4107-b542-5f46729c4927-system-cni-dir\") pod \"multus-additional-cni-plugins-hg44l\" 
(UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639474 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brmtm\" (UniqueName: \"kubernetes.io/projected/0a8488f0-d2d8-4107-b542-5f46729c4927-kube-api-access-brmtm\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639508 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-kubelet\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639535 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-cni-netd\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639570 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a8488f0-d2d8-4107-b542-5f46729c4927-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639560 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639606 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-669zr\" (UniqueName: \"kubernetes.io/projected/45bb6f45-fcf6-459d-bd44-766ec463ddb5-kube-api-access-669zr\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639638 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-sysctl-conf\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639662 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-os-release\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.640307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639710 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cr64q\" (UniqueName: \"kubernetes.io/projected/556cc9f0-a576-455e-b539-83577cba025c-kube-api-access-cr64q\") pod \"node-resolver-vdfxl\" (UID: \"556cc9f0-a576-455e-b539-83577cba025c\") " pod="openshift-dns/node-resolver-vdfxl" Apr 23 08:14:40.640307 
ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.639737 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-log-socket\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.643668 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.643649 2561 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 08:14:40.644512 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:40.644494 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:40.644599 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:40.644516 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:40.644599 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:40.644529 2561 projected.go:194] Error preparing data for projected volume kube-api-access-gkltj for pod openshift-network-diagnostics/network-check-target-mbfqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:40.644699 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:40.644617 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj podName:59f9a0a5-064a-4dd4-9790-0bff108c8fbe nodeName:}" failed. No retries permitted until 2026-04-23 08:14:41.14458046 +0000 UTC m=+3.075943245 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gkltj" (UniqueName: "kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj") pod "network-check-target-mbfqt" (UID: "59f9a0a5-064a-4dd4-9790-0bff108c8fbe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:40.647204 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.647155 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntvmn\" (UniqueName: \"kubernetes.io/projected/3f5d8347-124b-469f-8ac6-0c963d6c4634-kube-api-access-ntvmn\") pod \"node-ca-86wvz\" (UID: \"3f5d8347-124b-469f-8ac6-0c963d6c4634\") " pod="openshift-image-registry/node-ca-86wvz" Apr 23 08:14:40.647662 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.647636 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr64q\" (UniqueName: \"kubernetes.io/projected/556cc9f0-a576-455e-b539-83577cba025c-kube-api-access-cr64q\") pod \"node-resolver-vdfxl\" (UID: \"556cc9f0-a576-455e-b539-83577cba025c\") " pod="openshift-dns/node-resolver-vdfxl" Apr 23 08:14:40.648430 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.648412 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6dqp\" (UniqueName: \"kubernetes.io/projected/02a74ef7-7607-44c8-9c82-00f2c73ba0e8-kube-api-access-n6dqp\") pod \"iptables-alerter-l76bb\" (UID: \"02a74ef7-7607-44c8-9c82-00f2c73ba0e8\") " pod="openshift-network-operator/iptables-alerter-l76bb" Apr 23 08:14:40.651347 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.651329 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmtm\" (UniqueName: \"kubernetes.io/projected/0a8488f0-d2d8-4107-b542-5f46729c4927-kube-api-access-brmtm\") pod \"multus-additional-cni-plugins-hg44l\" (UID: \"0a8488f0-d2d8-4107-b542-5f46729c4927\") " 
pod="openshift-multus/multus-additional-cni-plugins-hg44l" Apr 23 08:14:40.740912 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.740883 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-slash\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741068 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.740917 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b438d86-63e2-4e66-a166-475de69c7900-konnectivity-ca\") pod \"konnectivity-agent-mht6n\" (UID: \"6b438d86-63e2-4e66-a166-475de69c7900\") " pod="kube-system/konnectivity-agent-mht6n" Apr 23 08:14:40.741068 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.740948 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-modprobe-d\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.741068 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.740968 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-var-lib-kubelet\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.741068 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.740988 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-var-lib-kubelet\") pod \"multus-jgxcr\" (UID: 
\"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.741068 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.740999 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-slash\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741068 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741010 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-systemd-units\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741068 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741056 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-systemd-units\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741068 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-run-systemd\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741097 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: 
\"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741113 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-var-lib-kubelet\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741129 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66bz6\" (UniqueName: \"kubernetes.io/projected/892bfeb4-76ad-49cf-b615-dfa772b87a7e-kube-api-access-66bz6\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741131 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-var-lib-kubelet\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741133 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-modprobe-d\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741169 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-run-systemd\") pod \"ovnkube-node-v5wkc\" (UID: 
\"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741172 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-run-ovn\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741191 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-run-ovn\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741196 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741216 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-cni-dir\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741241 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-cnibin\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " 
pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741281 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741307 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-kubelet\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741317 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-cni-dir\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741334 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-cnibin\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.741428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741331 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-cni-netd\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.741428 
ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741391 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-cni-netd\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:40.741398 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741393 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741393 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-kubelet\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741433 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-669zr\" (UniqueName: \"kubernetes.io/projected/45bb6f45-fcf6-459d-bd44-766ec463ddb5-kube-api-access-669zr\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741440 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741458 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-sysctl-conf\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:40.741471 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs podName:e92a791e-42ac-4855-b7b5-945f53108891 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:41.241452446 +0000 UTC m=+3.172815253 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs") pod "network-metrics-daemon-pmv55" (UID: "e92a791e-42ac-4855-b7b5-945f53108891") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741520 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-os-release\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741546 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-log-socket\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741567 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-sysctl-conf\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741599 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-os-release\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741833 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" 
(UniqueName: \"kubernetes.io/configmap/6b438d86-63e2-4e66-a166-475de69c7900-konnectivity-ca\") pod \"konnectivity-agent-mht6n\" (UID: \"6b438d86-63e2-4e66-a166-475de69c7900\") " pod="kube-system/konnectivity-agent-mht6n" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741929 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-run-ovn-kubernetes\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.741994 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-log-socket\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742042 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-cni-bin\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742121 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b438d86-63e2-4e66-a166-475de69c7900-agent-certs\") pod \"konnectivity-agent-mht6n\" (UID: \"6b438d86-63e2-4e66-a166-475de69c7900\") " pod="kube-system/konnectivity-agent-mht6n" Apr 23 08:14:40.742173 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742164 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-run-multus-certs\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742201 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05731c48-9bfe-46ed-8390-b6d811272383-env-overrides\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742226 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05731c48-9bfe-46ed-8390-b6d811272383-ovn-node-metrics-cert\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742282 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-run-netns\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742317 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-var-lib-openvswitch\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742349 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-sysconfig\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742380 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-tuned\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742408 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-run-k8s-cni-cncf-io\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742445 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-sys-fs\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742476 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/892bfeb4-76ad-49cf-b615-dfa772b87a7e-cni-binary-copy\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742517 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-run-netns\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742551 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-var-lib-cni-multus\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742584 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-hostroot\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742582 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-run-ovn-kubernetes\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742681 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-cni-bin\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742686 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-run-netns\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742738 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-hostroot\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742779 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-sys-fs\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.742934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.742807 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-var-lib-cni-multus\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.743672 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.743280 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/892bfeb4-76ad-49cf-b615-dfa772b87a7e-cni-binary-copy\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.743672 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.743345 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-run-k8s-cni-cncf-io\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.743672 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.743408 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-var-lib-openvswitch\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.743672 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.743463 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-host-run-netns\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.743672 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.743522 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-sysconfig\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.743672 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.743581 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-run-multus-certs\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr" Apr 23 08:14:40.744070 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744020 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/05731c48-9bfe-46ed-8390-b6d811272383-env-overrides\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.744153 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744072 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkc2\" (UniqueName: \"kubernetes.io/projected/e92a791e-42ac-4855-b7b5-945f53108891-kube-api-access-hwkc2\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:40.744153 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744106 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-run-openvswitch\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:14:40.744153 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744144 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-socket-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" Apr 23 08:14:40.744304 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744172 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-host\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf" Apr 23 08:14:40.744304 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744202 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7sssw\" (UniqueName: \"kubernetes.io/projected/2b615e73-dc52-4885-94d7-dc4fecd877f6-kube-api-access-7sssw\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.744304 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744231 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-etc-openvswitch\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.744442 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744300 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x75hp\" (UniqueName: \"kubernetes.io/projected/05731c48-9bfe-46ed-8390-b6d811272383-kube-api-access-x75hp\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.744442 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744337 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-device-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.744442 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744380 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-sysctl-d\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.744442 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744413 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05731c48-9bfe-46ed-8390-b6d811272383-ovnkube-script-lib\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.744617 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744439 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-registration-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.744617 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744466 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-sys\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.744617 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744494 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-lib-modules\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.744617 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744524 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-system-cni-dir\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.744617 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744554 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-daemon-config\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.744617 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744586 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-node-log\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.744617 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744608 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-etc-selinux\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.744892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744636 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b615e73-dc52-4885-94d7-dc4fecd877f6-tmp\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.744892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744665 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-conf-dir\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.744892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744693 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-etc-kubernetes\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.744892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744720 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-run\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.744892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744743 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-socket-dir-parent\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.744892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744792 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05731c48-9bfe-46ed-8390-b6d811272383-ovnkube-config\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.744892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744821 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-var-lib-cni-bin\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.744892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744846 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-kubernetes\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.744892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.744872 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-systemd\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.745333 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745003 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-system-cni-dir\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.745333 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745045 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-sysctl-d\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.745333 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745250 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-run-openvswitch\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.745486 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745410 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-socket-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.745486 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745470 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-host\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.745577 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745511 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05731c48-9bfe-46ed-8390-b6d811272383-ovn-node-metrics-cert\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.745651 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745625 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05731c48-9bfe-46ed-8390-b6d811272383-ovnkube-config\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.745762 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745719 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-socket-dir-parent\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.745824 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745780 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05731c48-9bfe-46ed-8390-b6d811272383-ovnkube-script-lib\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.745824 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745781 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-device-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.745824 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745784 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-daemon-config\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.746008 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745834 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-etc-selinux\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.746008 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745863 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-etc-kubernetes\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.746008 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745870 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-kubernetes\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.746008 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745863 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-host-var-lib-cni-bin\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.746008 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.745965 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/45bb6f45-fcf6-459d-bd44-766ec463ddb5-registration-dir\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.746252 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.746021 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/892bfeb4-76ad-49cf-b615-dfa772b87a7e-multus-conf-dir\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.746252 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.746034 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-systemd\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.746252 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.746095 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-node-log\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.746252 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.746156 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05731c48-9bfe-46ed-8390-b6d811272383-etc-openvswitch\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.746252 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.746179 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-lib-modules\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.746252 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.746213 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-sys\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.747006 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.746683 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2b615e73-dc52-4885-94d7-dc4fecd877f6-run\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.748181 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.748159 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b615e73-dc52-4885-94d7-dc4fecd877f6-tmp\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.748327 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.748304 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b438d86-63e2-4e66-a166-475de69c7900-agent-certs\") pod \"konnectivity-agent-mht6n\" (UID: \"6b438d86-63e2-4e66-a166-475de69c7900\") " pod="kube-system/konnectivity-agent-mht6n"
Apr 23 08:14:40.748481 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.748439 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2b615e73-dc52-4885-94d7-dc4fecd877f6-etc-tuned\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.749851 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.749831 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66bz6\" (UniqueName: \"kubernetes.io/projected/892bfeb4-76ad-49cf-b615-dfa772b87a7e-kube-api-access-66bz6\") pod \"multus-jgxcr\" (UID: \"892bfeb4-76ad-49cf-b615-dfa772b87a7e\") " pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:40.753497 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.753443 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-669zr\" (UniqueName: \"kubernetes.io/projected/45bb6f45-fcf6-459d-bd44-766ec463ddb5-kube-api-access-669zr\") pod \"aws-ebs-csi-driver-node-gw5ts\" (UID: \"45bb6f45-fcf6-459d-bd44-766ec463ddb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.753946 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.753852 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x75hp\" (UniqueName: \"kubernetes.io/projected/05731c48-9bfe-46ed-8390-b6d811272383-kube-api-access-x75hp\") pod \"ovnkube-node-v5wkc\" (UID: \"05731c48-9bfe-46ed-8390-b6d811272383\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.754331 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.754246 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sssw\" (UniqueName: \"kubernetes.io/projected/2b615e73-dc52-4885-94d7-dc4fecd877f6-kube-api-access-7sssw\") pod \"tuned-zngnf\" (UID: \"2b615e73-dc52-4885-94d7-dc4fecd877f6\") " pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.754401 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.754382 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkc2\" (UniqueName: \"kubernetes.io/projected/e92a791e-42ac-4855-b7b5-945f53108891-kube-api-access-hwkc2\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:14:40.829415 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.829380 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hg44l"
Apr 23 08:14:40.845143 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.845118 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-86wvz"
Apr 23 08:14:40.853748 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.853722 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vdfxl"
Apr 23 08:14:40.860314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.860290 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l76bb"
Apr 23 08:14:40.869041 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.869017 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:14:40.876626 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.876608 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mht6n"
Apr 23 08:14:40.884110 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.884093 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts"
Apr 23 08:14:40.889625 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.889607 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zngnf"
Apr 23 08:14:40.895141 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:40.895123 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jgxcr"
Apr 23 08:14:41.030741 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.030659 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:14:41.147350 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.147311 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkltj\" (UniqueName: \"kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj\") pod \"network-check-target-mbfqt\" (UID: \"59f9a0a5-064a-4dd4-9790-0bff108c8fbe\") " pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:14:41.147505 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:41.147487 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:14:41.147568 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:41.147508 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:14:41.147568 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:41.147521 2561 projected.go:194] Error preparing data for projected volume kube-api-access-gkltj for pod openshift-network-diagnostics/network-check-target-mbfqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:41.147645 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:41.147585 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj podName:59f9a0a5-064a-4dd4-9790-0bff108c8fbe nodeName:}" failed. No retries permitted until 2026-04-23 08:14:42.147566888 +0000 UTC m=+4.078929680 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gkltj" (UniqueName: "kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj") pod "network-check-target-mbfqt" (UID: "59f9a0a5-064a-4dd4-9790-0bff108c8fbe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:41.247813 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.247777 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:14:41.247972 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:41.247929 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:41.248012 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:41.247991 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs podName:e92a791e-42ac-4855-b7b5-945f53108891 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:42.24797577 +0000 UTC m=+4.179338549 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs") pod "network-metrics-daemon-pmv55" (UID: "e92a791e-42ac-4855-b7b5-945f53108891") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:41.413348 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:41.413318 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05731c48_9bfe_46ed_8390_b6d811272383.slice/crio-c47cf367c474094cb172858c947f6a88001ed308d06f06b01c2fc95298fb234c WatchSource:0}: Error finding container c47cf367c474094cb172858c947f6a88001ed308d06f06b01c2fc95298fb234c: Status 404 returned error can't find the container with id c47cf367c474094cb172858c947f6a88001ed308d06f06b01c2fc95298fb234c
Apr 23 08:14:41.417907 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:41.417879 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod556cc9f0_a576_455e_b539_83577cba025c.slice/crio-d6cac320d052960a8b19ae6e80d659016cc24921b1b1da5b82d35d6b38ddac61 WatchSource:0}: Error finding container d6cac320d052960a8b19ae6e80d659016cc24921b1b1da5b82d35d6b38ddac61: Status 404 returned error can't find the container with id d6cac320d052960a8b19ae6e80d659016cc24921b1b1da5b82d35d6b38ddac61
Apr 23 08:14:41.419361 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:41.419095 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a8488f0_d2d8_4107_b542_5f46729c4927.slice/crio-1ebffdcda85dbff7b63bb05019e51ce279fff3627dfafaed743b6a005e87c151 WatchSource:0}: Error finding container 1ebffdcda85dbff7b63bb05019e51ce279fff3627dfafaed743b6a005e87c151: Status 404 returned error can't find the container with id 1ebffdcda85dbff7b63bb05019e51ce279fff3627dfafaed743b6a005e87c151
Apr 23 08:14:41.421630 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:41.421400 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a74ef7_7607_44c8_9c82_00f2c73ba0e8.slice/crio-b4200f42354e1a88ebbc40e63a63eebf76ec6834f327ab9976e622a753fffed0 WatchSource:0}: Error finding container b4200f42354e1a88ebbc40e63a63eebf76ec6834f327ab9976e622a753fffed0: Status 404 returned error can't find the container with id b4200f42354e1a88ebbc40e63a63eebf76ec6834f327ab9976e622a753fffed0
Apr 23 08:14:41.422123 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:41.422098 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f5d8347_124b_469f_8ac6_0c963d6c4634.slice/crio-eaa0647b944b7d6a2174601710fa5189e3a7d8e833801f1f1fc4939d4450d7bb WatchSource:0}: Error finding container eaa0647b944b7d6a2174601710fa5189e3a7d8e833801f1f1fc4939d4450d7bb: Status 404 returned error can't find the container with id eaa0647b944b7d6a2174601710fa5189e3a7d8e833801f1f1fc4939d4450d7bb
Apr 23 08:14:41.423873 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:41.423597 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b438d86_63e2_4e66_a166_475de69c7900.slice/crio-f66919f84b0b78a8cc748d295fd81e14709ea7de5e3c43d2541e6cfb9a3b9b26 WatchSource:0}: Error finding container f66919f84b0b78a8cc748d295fd81e14709ea7de5e3c43d2541e6cfb9a3b9b26: Status 404 returned error can't find the container with id f66919f84b0b78a8cc748d295fd81e14709ea7de5e3c43d2541e6cfb9a3b9b26
Apr 23 08:14:41.443576 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:41.443544 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod892bfeb4_76ad_49cf_b615_dfa772b87a7e.slice/crio-c5082da1de13fedb9f7adbd889b200d86d63564b9201bcc6fd3fa98b59fdf469 WatchSource:0}: Error finding container c5082da1de13fedb9f7adbd889b200d86d63564b9201bcc6fd3fa98b59fdf469: Status 404 returned error can't find the container with id c5082da1de13fedb9f7adbd889b200d86d63564b9201bcc6fd3fa98b59fdf469
Apr 23 08:14:41.444564 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:41.444544 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b615e73_dc52_4885_94d7_dc4fecd877f6.slice/crio-24579baa1f0dbad19b5988406b322e91c6a68445e215d4858dbc267e736fe84d WatchSource:0}: Error finding container 24579baa1f0dbad19b5988406b322e91c6a68445e215d4858dbc267e736fe84d: Status 404 returned error can't find the container with id 24579baa1f0dbad19b5988406b322e91c6a68445e215d4858dbc267e736fe84d
Apr 23 08:14:41.445485 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:14:41.445455 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45bb6f45_fcf6_459d_bd44_766ec463ddb5.slice/crio-5e2dba7c4824d389644dd7b0d0d6d10369582b788b587a5a10e5df727ae96a9f WatchSource:0}: Error finding container 5e2dba7c4824d389644dd7b0d0d6d10369582b788b587a5a10e5df727ae96a9f: Status 404 returned error can't find the container with id 5e2dba7c4824d389644dd7b0d0d6d10369582b788b587a5a10e5df727ae96a9f
Apr 23 08:14:41.631856 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.631670 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:09:39 +0000 UTC" deadline="2027-11-13 12:41:07.868171902 +0000 UTC"
Apr 23 08:14:41.631856 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.631849 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13660h26m26.23632519s"
Apr 23 08:14:41.659991 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.659927 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:14:41.660153 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.659930 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:14:41.660153 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:41.660017 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe"
Apr 23 08:14:41.660153 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:41.660101 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891"
Apr 23 08:14:41.667473 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.667451 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal" event={"ID":"4a0c7941d424e31aadfd07308d6e5c7b","Type":"ContainerStarted","Data":"50695ae0cde9b3d695c5fb84fed7729c982107a6b50293afca515882a26a1b9d"}
Apr 23 08:14:41.668455 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.668434 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" event={"ID":"45bb6f45-fcf6-459d-bd44-766ec463ddb5","Type":"ContainerStarted","Data":"5e2dba7c4824d389644dd7b0d0d6d10369582b788b587a5a10e5df727ae96a9f"}
Apr 23 08:14:41.669391 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.669368 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-86wvz" event={"ID":"3f5d8347-124b-469f-8ac6-0c963d6c4634","Type":"ContainerStarted","Data":"eaa0647b944b7d6a2174601710fa5189e3a7d8e833801f1f1fc4939d4450d7bb"}
Apr 23 08:14:41.670306 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.670283 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vdfxl" event={"ID":"556cc9f0-a576-455e-b539-83577cba025c","Type":"ContainerStarted","Data":"d6cac320d052960a8b19ae6e80d659016cc24921b1b1da5b82d35d6b38ddac61"}
Apr 23 08:14:41.671307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.671289 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zngnf" event={"ID":"2b615e73-dc52-4885-94d7-dc4fecd877f6","Type":"ContainerStarted","Data":"24579baa1f0dbad19b5988406b322e91c6a68445e215d4858dbc267e736fe84d"}
Apr 23 08:14:41.672152 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.672133 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jgxcr" event={"ID":"892bfeb4-76ad-49cf-b615-dfa772b87a7e","Type":"ContainerStarted","Data":"c5082da1de13fedb9f7adbd889b200d86d63564b9201bcc6fd3fa98b59fdf469"}
Apr 23 08:14:41.673133 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.673092 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mht6n" event={"ID":"6b438d86-63e2-4e66-a166-475de69c7900","Type":"ContainerStarted","Data":"f66919f84b0b78a8cc748d295fd81e14709ea7de5e3c43d2541e6cfb9a3b9b26"}
Apr 23 08:14:41.674079 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.674061 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l76bb" event={"ID":"02a74ef7-7607-44c8-9c82-00f2c73ba0e8","Type":"ContainerStarted","Data":"b4200f42354e1a88ebbc40e63a63eebf76ec6834f327ab9976e622a753fffed0"}
Apr 23 08:14:41.674952 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.674929 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg44l" event={"ID":"0a8488f0-d2d8-4107-b542-5f46729c4927","Type":"ContainerStarted","Data":"1ebffdcda85dbff7b63bb05019e51ce279fff3627dfafaed743b6a005e87c151"}
Apr 23 08:14:41.675849 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:41.675826 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" event={"ID":"05731c48-9bfe-46ed-8390-b6d811272383","Type":"ContainerStarted","Data":"c47cf367c474094cb172858c947f6a88001ed308d06f06b01c2fc95298fb234c"}
Apr 23 08:14:42.153566 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:42.153528 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkltj\" (UniqueName: \"kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj\") pod \"network-check-target-mbfqt\" (UID: \"59f9a0a5-064a-4dd4-9790-0bff108c8fbe\") " pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:14:42.153846 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:42.153823 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:14:42.153928 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:42.153852 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:14:42.153928 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:42.153865 2561 projected.go:194] Error preparing data for projected volume kube-api-access-gkltj for pod openshift-network-diagnostics/network-check-target-mbfqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:42.153928 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:42.153921 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj podName:59f9a0a5-064a-4dd4-9790-0bff108c8fbe nodeName:}" failed. No retries permitted until 2026-04-23 08:14:44.153903392 +0000 UTC m=+6.085266194 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gkltj" (UniqueName: "kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj") pod "network-check-target-mbfqt" (UID: "59f9a0a5-064a-4dd4-9790-0bff108c8fbe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:14:42.254826 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:42.254794 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:14:42.254999 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:42.254977 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:42.255067 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:42.255047 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs podName:e92a791e-42ac-4855-b7b5-945f53108891 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:44.255027676 +0000 UTC m=+6.186390456 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs") pod "network-metrics-daemon-pmv55" (UID: "e92a791e-42ac-4855-b7b5-945f53108891") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:42.693628 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:42.692714 2561 generic.go:358] "Generic (PLEG): container finished" podID="327ce7e2a793cf29ed8c9455dfd4f163" containerID="c5172c64e2c8da529b6f054a73cd8e8846d66725dc263d7d5cd7de9a6fdebbd6" exitCode=0 Apr 23 08:14:42.693628 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:42.693376 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" event={"ID":"327ce7e2a793cf29ed8c9455dfd4f163","Type":"ContainerDied","Data":"c5172c64e2c8da529b6f054a73cd8e8846d66725dc263d7d5cd7de9a6fdebbd6"} Apr 23 08:14:42.709920 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:42.709866 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-8.ec2.internal" podStartSLOduration=2.709851213 podStartE2EDuration="2.709851213s" podCreationTimestamp="2026-04-23 08:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:41.680332022 +0000 UTC m=+3.611694852" watchObservedRunningTime="2026-04-23 08:14:42.709851213 +0000 UTC m=+4.641214015" Apr 23 08:14:43.660523 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:43.660495 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:43.660698 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:43.660620 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe" Apr 23 08:14:43.660698 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:43.660628 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:43.660811 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:43.660735 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891" Apr 23 08:14:43.720407 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:43.720374 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" event={"ID":"327ce7e2a793cf29ed8c9455dfd4f163","Type":"ContainerStarted","Data":"aeed6df74d4a27f1b5ee1742692f5fdbb1e949c955bf09bb95d167bdb5809f55"} Apr 23 08:14:43.741476 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:43.741423 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-8.ec2.internal" podStartSLOduration=3.7414066679999998 podStartE2EDuration="3.741406668s" podCreationTimestamp="2026-04-23 08:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:43.740552297 +0000 UTC m=+5.671915098" watchObservedRunningTime="2026-04-23 08:14:43.741406668 +0000 UTC m=+5.672769471" Apr 23 08:14:44.171998 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:44.171945 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkltj\" (UniqueName: \"kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj\") pod \"network-check-target-mbfqt\" (UID: \"59f9a0a5-064a-4dd4-9790-0bff108c8fbe\") " pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:44.172165 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:44.172115 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:44.172165 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:44.172133 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:44.172165 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:44.172145 2561 projected.go:194] Error preparing data for projected volume kube-api-access-gkltj for pod openshift-network-diagnostics/network-check-target-mbfqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:44.172390 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:44.172202 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj podName:59f9a0a5-064a-4dd4-9790-0bff108c8fbe nodeName:}" failed. No retries permitted until 2026-04-23 08:14:48.172184346 +0000 UTC m=+10.103547143 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gkltj" (UniqueName: "kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj") pod "network-check-target-mbfqt" (UID: "59f9a0a5-064a-4dd4-9790-0bff108c8fbe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:44.272989 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:44.272952 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:44.273229 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:44.273195 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:44.273343 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:44.273295 2561 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs podName:e92a791e-42ac-4855-b7b5-945f53108891 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:48.273275352 +0000 UTC m=+10.204638147 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs") pod "network-metrics-daemon-pmv55" (UID: "e92a791e-42ac-4855-b7b5-945f53108891") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:45.660118 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:45.660086 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:45.660583 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:45.660216 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe" Apr 23 08:14:45.660583 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:45.660095 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:45.660712 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:45.660688 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891" Apr 23 08:14:47.660241 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:47.660195 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:47.660726 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:47.660346 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891" Apr 23 08:14:47.660726 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:47.660511 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:47.660726 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:47.660614 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe" Apr 23 08:14:48.205223 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:48.205183 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkltj\" (UniqueName: \"kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj\") pod \"network-check-target-mbfqt\" (UID: \"59f9a0a5-064a-4dd4-9790-0bff108c8fbe\") " pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:48.205410 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:48.205373 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:48.205410 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:48.205400 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:48.205533 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:48.205424 2561 projected.go:194] Error preparing data for projected volume kube-api-access-gkltj for pod openshift-network-diagnostics/network-check-target-mbfqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:48.205533 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:48.205490 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj podName:59f9a0a5-064a-4dd4-9790-0bff108c8fbe nodeName:}" failed. No retries permitted until 2026-04-23 08:14:56.205470739 +0000 UTC m=+18.136833533 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gkltj" (UniqueName: "kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj") pod "network-check-target-mbfqt" (UID: "59f9a0a5-064a-4dd4-9790-0bff108c8fbe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:48.306050 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:48.305969 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:48.306239 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:48.306093 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:48.306239 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:48.306168 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs podName:e92a791e-42ac-4855-b7b5-945f53108891 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:56.306149407 +0000 UTC m=+18.237512192 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs") pod "network-metrics-daemon-pmv55" (UID: "e92a791e-42ac-4855-b7b5-945f53108891") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:49.660700 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:49.660659 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:49.661110 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:49.660663 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:49.661110 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:49.660783 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe" Apr 23 08:14:49.661110 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:49.660843 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891" Apr 23 08:14:51.660817 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:51.660776 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:51.661236 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:51.660782 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:51.661236 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:51.660910 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe" Apr 23 08:14:51.661236 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:51.660987 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891" Apr 23 08:14:53.660695 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:53.660663 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:53.661158 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:53.660704 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:53.661158 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:53.660795 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891" Apr 23 08:14:53.661158 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:53.660908 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe" Apr 23 08:14:55.660773 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:55.660734 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:55.661301 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:55.660737 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:55.661301 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:55.660854 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe" Apr 23 08:14:55.661301 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:55.660973 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891" Apr 23 08:14:56.265923 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:56.265882 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkltj\" (UniqueName: \"kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj\") pod \"network-check-target-mbfqt\" (UID: \"59f9a0a5-064a-4dd4-9790-0bff108c8fbe\") " pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:56.266097 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:56.266073 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:56.266175 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:56.266099 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:56.266175 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:56.266113 2561 projected.go:194] Error preparing data for projected volume kube-api-access-gkltj for pod openshift-network-diagnostics/network-check-target-mbfqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:56.266360 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:56.266181 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj podName:59f9a0a5-064a-4dd4-9790-0bff108c8fbe nodeName:}" failed. No retries permitted until 2026-04-23 08:15:12.266161932 +0000 UTC m=+34.197524724 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gkltj" (UniqueName: "kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj") pod "network-check-target-mbfqt" (UID: "59f9a0a5-064a-4dd4-9790-0bff108c8fbe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:56.366854 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:56.366816 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:56.367007 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:56.366939 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:56.367007 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:56.367000 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs podName:e92a791e-42ac-4855-b7b5-945f53108891 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:12.366980939 +0000 UTC m=+34.298343737 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs") pod "network-metrics-daemon-pmv55" (UID: "e92a791e-42ac-4855-b7b5-945f53108891") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:57.660238 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:57.660201 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:14:57.660702 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:57.660344 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe" Apr 23 08:14:57.660702 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:57.660398 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:14:57.660702 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:57.660498 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891" Apr 23 08:14:58.746856 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.746832 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zngnf" event={"ID":"2b615e73-dc52-4885-94d7-dc4fecd877f6","Type":"ContainerStarted","Data":"4f2a9447dba82045944bd782c6819d390809b8072339a2c0a987d7072d3b9c5f"} Apr 23 08:14:58.748679 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.748652 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jgxcr" event={"ID":"892bfeb4-76ad-49cf-b615-dfa772b87a7e","Type":"ContainerStarted","Data":"0522d1d9ce172ad9dc26040b25bf70acf33f9f88b2e504bdcf0461820daf6429"} Apr 23 08:14:58.750683 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.750659 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mht6n" event={"ID":"6b438d86-63e2-4e66-a166-475de69c7900","Type":"ContainerStarted","Data":"66c1e8c02715d8291122e60b1e0a5854db0bac51ea85cd8230577ae9f0cca86c"} Apr 23 08:14:58.751869 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.751851 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg44l" event={"ID":"0a8488f0-d2d8-4107-b542-5f46729c4927","Type":"ContainerStarted","Data":"7241e4d4b8500647dcfdaccc3a493c8ee493c530d586a0510140f9c38ab67bfd"} Apr 23 08:14:58.753214 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.753189 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" event={"ID":"05731c48-9bfe-46ed-8390-b6d811272383","Type":"ContainerStarted","Data":"1bbd14b7ffba1f92cf569ace76f13f59adda6f4db57ddaf7360c0932088f029a"} Apr 23 08:14:58.753214 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.753211 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" 
event={"ID":"05731c48-9bfe-46ed-8390-b6d811272383","Type":"ContainerStarted","Data":"8378f213817fa9bbfaefa8572a533aa700badd44ddd16c09e059df0007a4be07"}
Apr 23 08:14:58.754299 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.754280 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" event={"ID":"45bb6f45-fcf6-459d-bd44-766ec463ddb5","Type":"ContainerStarted","Data":"bb1dc049bb48e98409b86df2ef49863474e93cf5a35bccf6588c4b9961aec6ff"}
Apr 23 08:14:58.755453 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.755434 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-86wvz" event={"ID":"3f5d8347-124b-469f-8ac6-0c963d6c4634","Type":"ContainerStarted","Data":"d3d000361973451b1044993b6363a30036e162ecc9848a677a71acb5eef03465"}
Apr 23 08:14:58.756576 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.756556 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vdfxl" event={"ID":"556cc9f0-a576-455e-b539-83577cba025c","Type":"ContainerStarted","Data":"da4ae2127ff382ccf33a3abd950d00c91ea8e58945d305a2b29c4b06d8eca386"}
Apr 23 08:14:58.763304 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.763246 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zngnf" podStartSLOduration=3.84966725 podStartE2EDuration="20.763236408s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.447224554 +0000 UTC m=+3.378587344" lastFinishedPulling="2026-04-23 08:14:58.360793712 +0000 UTC m=+20.292156502" observedRunningTime="2026-04-23 08:14:58.762826426 +0000 UTC m=+20.694189227" watchObservedRunningTime="2026-04-23 08:14:58.763236408 +0000 UTC m=+20.694599208"
Apr 23 08:14:58.777019 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.776983 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mht6n" podStartSLOduration=8.512262109 podStartE2EDuration="20.776973857s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.442476747 +0000 UTC m=+3.373839543" lastFinishedPulling="2026-04-23 08:14:53.707188503 +0000 UTC m=+15.638551291" observedRunningTime="2026-04-23 08:14:58.776532586 +0000 UTC m=+20.707895387" watchObservedRunningTime="2026-04-23 08:14:58.776973857 +0000 UTC m=+20.708336658"
Apr 23 08:14:58.790006 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.789974 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-86wvz" podStartSLOduration=3.873374955 podStartE2EDuration="20.789966371s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.442497899 +0000 UTC m=+3.373860689" lastFinishedPulling="2026-04-23 08:14:58.359089322 +0000 UTC m=+20.290452105" observedRunningTime="2026-04-23 08:14:58.789738555 +0000 UTC m=+20.721101356" watchObservedRunningTime="2026-04-23 08:14:58.789966371 +0000 UTC m=+20.721329171"
Apr 23 08:14:58.806130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:58.806095 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jgxcr" podStartSLOduration=3.879005663 podStartE2EDuration="20.806085165s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.447145694 +0000 UTC m=+3.378508490" lastFinishedPulling="2026-04-23 08:14:58.374225198 +0000 UTC m=+20.305587992" observedRunningTime="2026-04-23 08:14:58.805758284 +0000 UTC m=+20.737121087" watchObservedRunningTime="2026-04-23 08:14:58.806085165 +0000 UTC m=+20.737447967"
Apr 23 08:14:59.200240 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.200209 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mht6n"
Apr 23 08:14:59.200833 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.200812 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mht6n"
Apr 23 08:14:59.218514 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.218465 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vdfxl" podStartSLOduration=4.323115699 podStartE2EDuration="21.218449951s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.421228687 +0000 UTC m=+3.352591475" lastFinishedPulling="2026-04-23 08:14:58.316562948 +0000 UTC m=+20.247925727" observedRunningTime="2026-04-23 08:14:58.841759057 +0000 UTC m=+20.773121858" watchObservedRunningTime="2026-04-23 08:14:59.218449951 +0000 UTC m=+21.149812800"
Apr 23 08:14:59.660363 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.660283 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:14:59.660517 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.660283 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:14:59.660517 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:59.660419 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891"
Apr 23 08:14:59.660517 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:14:59.660444 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe"
Apr 23 08:14:59.759807 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.759781 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l76bb" event={"ID":"02a74ef7-7607-44c8-9c82-00f2c73ba0e8","Type":"ContainerStarted","Data":"d48c23c549a3c9e4d632ddf42fbde22ea2e862d6c6059eadada65fc88992fb02"}
Apr 23 08:14:59.761168 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.761143 2561 generic.go:358] "Generic (PLEG): container finished" podID="0a8488f0-d2d8-4107-b542-5f46729c4927" containerID="7241e4d4b8500647dcfdaccc3a493c8ee493c530d586a0510140f9c38ab67bfd" exitCode=0
Apr 23 08:14:59.761277 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.761214 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg44l" event={"ID":"0a8488f0-d2d8-4107-b542-5f46729c4927","Type":"ContainerDied","Data":"7241e4d4b8500647dcfdaccc3a493c8ee493c530d586a0510140f9c38ab67bfd"}
Apr 23 08:14:59.764212 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.764193 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" event={"ID":"05731c48-9bfe-46ed-8390-b6d811272383","Type":"ContainerStarted","Data":"fc79c49313202589f7c9e48b1190cf2cca9975a3ea49732517be59d16dbf5a25"}
Apr 23 08:14:59.764315 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.764221 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" event={"ID":"05731c48-9bfe-46ed-8390-b6d811272383","Type":"ContainerStarted","Data":"5c79ccedead728103ca84a204882ced1341cad2e3c49aed05540b65507132299"}
Apr 23 08:14:59.764315 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.764237 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" event={"ID":"05731c48-9bfe-46ed-8390-b6d811272383","Type":"ContainerStarted","Data":"819c2e247acf4c3c18a63d076f471184837a83e48cef0fd5c1073e9165780411"}
Apr 23 08:14:59.764315 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.764251 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" event={"ID":"05731c48-9bfe-46ed-8390-b6d811272383","Type":"ContainerStarted","Data":"084b4896444d8311d2ac3fafb41441441ceab3a3294a72b9754d1d16c38ec50e"}
Apr 23 08:14:59.774893 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.774849 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-l76bb" podStartSLOduration=4.85840302 podStartE2EDuration="21.774834834s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.442592289 +0000 UTC m=+3.373955080" lastFinishedPulling="2026-04-23 08:14:58.359024105 +0000 UTC m=+20.290386894" observedRunningTime="2026-04-23 08:14:59.774616423 +0000 UTC m=+21.705979227" watchObservedRunningTime="2026-04-23 08:14:59.774834834 +0000 UTC m=+21.706197636"
Apr 23 08:14:59.899191 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:14:59.899169 2561 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 08:15:00.593577 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:00.593450 2561 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:14:59.899187691Z","UUID":"804ff8d3-af2e-4b90-a645-172fb3a12b7a","Handler":null,"Name":"","Endpoint":""}
Apr 23 08:15:00.595177 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:00.595143 2561 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 08:15:00.595177 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:00.595172 2561 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 08:15:00.768036 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:00.767997 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" event={"ID":"45bb6f45-fcf6-459d-bd44-766ec463ddb5","Type":"ContainerStarted","Data":"a632ec172a43b37a7b6cd386c1c2a1c192b09a0843633c6ce905ed2dc2898197"}
Apr 23 08:15:00.768036 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:00.768036 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:15:01.660934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:01.660718 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:15:01.661091 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:01.660718 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:15:01.661091 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:01.660981 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891"
Apr 23 08:15:01.661091 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:01.661016 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe"
Apr 23 08:15:01.771894 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:01.771860 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" event={"ID":"45bb6f45-fcf6-459d-bd44-766ec463ddb5","Type":"ContainerStarted","Data":"4cf68a6444d3e512c9a53cddcb0827bf476467403108a7d14eb4ece750d311f9"}
Apr 23 08:15:01.774877 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:01.774854 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" event={"ID":"05731c48-9bfe-46ed-8390-b6d811272383","Type":"ContainerStarted","Data":"902e02637502f2af71615bfb96d3372d42fdaf7e699ae434631b31bd61e31f53"}
Apr 23 08:15:01.789236 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:01.789189 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gw5ts" podStartSLOduration=4.177837679 podStartE2EDuration="23.789176764s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.44714884 +0000 UTC m=+3.378511631" lastFinishedPulling="2026-04-23 08:15:01.058487924 +0000 UTC m=+22.989850716" observedRunningTime="2026-04-23 08:15:01.788952185 +0000 UTC m=+23.720314990" watchObservedRunningTime="2026-04-23 08:15:01.789176764 +0000 UTC m=+23.720539566"
Apr 23 08:15:03.156435 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:03.156393 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mht6n"
Apr 23 08:15:03.156898 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:03.156555 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:15:03.157084 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:03.157058 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mht6n"
Apr 23 08:15:03.660394 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:03.660358 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:15:03.660596 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:03.660368 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:15:03.660596 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:03.660474 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe"
Apr 23 08:15:03.660596 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:03.660565 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891"
Apr 23 08:15:04.781100 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:04.780904 2561 generic.go:358] "Generic (PLEG): container finished" podID="0a8488f0-d2d8-4107-b542-5f46729c4927" containerID="3c6c231909fbf714696254f69be91ed1f075b9d847f15a61b08da4b556e7f6d9" exitCode=0
Apr 23 08:15:04.781913 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:04.780994 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg44l" event={"ID":"0a8488f0-d2d8-4107-b542-5f46729c4927","Type":"ContainerDied","Data":"3c6c231909fbf714696254f69be91ed1f075b9d847f15a61b08da4b556e7f6d9"}
Apr 23 08:15:04.784331 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:04.784310 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" event={"ID":"05731c48-9bfe-46ed-8390-b6d811272383","Type":"ContainerStarted","Data":"4b8bab81dfb2b2a9dcc0fc69af20b3bdd4089c22de03532ac41abf7cb130a149"}
Apr 23 08:15:04.784659 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:04.784643 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:15:04.784722 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:04.784671 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:15:04.798688 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:04.798672 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:15:04.827195 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:04.827159 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" podStartSLOduration=9.836258865 podStartE2EDuration="26.827147805s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.415808838 +0000 UTC m=+3.347171620" lastFinishedPulling="2026-04-23 08:14:58.406697781 +0000 UTC m=+20.338060560" observedRunningTime="2026-04-23 08:15:04.827130298 +0000 UTC m=+26.758493109" watchObservedRunningTime="2026-04-23 08:15:04.827147805 +0000 UTC m=+26.758510605"
Apr 23 08:15:05.660462 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:05.660428 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:15:05.660642 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:05.660427 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:15:05.660642 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:05.660560 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891"
Apr 23 08:15:05.660642 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:05.660607 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe"
Apr 23 08:15:05.788635 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:05.788601 2561 generic.go:358] "Generic (PLEG): container finished" podID="0a8488f0-d2d8-4107-b542-5f46729c4927" containerID="bae9f938e2d0a47a219b19034846710c78b33861000d9382e6fb379eb920ad04" exitCode=0
Apr 23 08:15:05.789013 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:05.788682 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg44l" event={"ID":"0a8488f0-d2d8-4107-b542-5f46729c4927","Type":"ContainerDied","Data":"bae9f938e2d0a47a219b19034846710c78b33861000d9382e6fb379eb920ad04"}
Apr 23 08:15:05.789313 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:05.789292 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:15:05.803649 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:05.803624 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc"
Apr 23 08:15:06.312428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:06.312215 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pmv55"]
Apr 23 08:15:06.312428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:06.312370 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:15:06.312661 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:06.312500 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891"
Apr 23 08:15:06.312871 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:06.312844 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mbfqt"]
Apr 23 08:15:06.313000 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:06.312925 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:15:06.313069 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:06.313005 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe"
Apr 23 08:15:06.792213 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:06.792133 2561 generic.go:358] "Generic (PLEG): container finished" podID="0a8488f0-d2d8-4107-b542-5f46729c4927" containerID="b3eb47a7860eff08f5ea5df8c6156fd97f33543f400d96149ab678a50960a024" exitCode=0
Apr 23 08:15:06.792593 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:06.792213 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg44l" event={"ID":"0a8488f0-d2d8-4107-b542-5f46729c4927","Type":"ContainerDied","Data":"b3eb47a7860eff08f5ea5df8c6156fd97f33543f400d96149ab678a50960a024"}
Apr 23 08:15:07.659965 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:07.659816 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:15:07.660115 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:07.660034 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe"
Apr 23 08:15:08.661109 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:08.661071 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:15:08.661546 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:08.661175 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891"
Apr 23 08:15:09.660061 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:09.660028 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:15:09.660204 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:09.660167 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbfqt" podUID="59f9a0a5-064a-4dd4-9790-0bff108c8fbe"
Apr 23 08:15:10.660150 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:10.660115 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:15:10.660669 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:10.660236 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891"
Apr 23 08:15:11.402880 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.402850 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-8.ec2.internal" event="NodeReady"
Apr 23 08:15:11.403069 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.402988 2561 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 08:15:11.452883 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.452848 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-46nvx"]
Apr 23 08:15:11.457658 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.457637 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ttph8"]
Apr 23 08:15:11.457804 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.457785 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.460442 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.460423 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:15:11.460893 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.460874 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 08:15:11.460976 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.460925 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 08:15:11.461193 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.461178 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f2cmm\""
Apr 23 08:15:11.463051 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.463035 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 08:15:11.464124 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.464107 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 08:15:11.464614 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.464595 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 08:15:11.465052 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.465029 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkwh7\""
Apr 23 08:15:11.471974 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.471953 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ttph8"]
Apr 23 08:15:11.472720 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.472690 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-46nvx"]
Apr 23 08:15:11.588050 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.588021 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9jzm\" (UniqueName: \"kubernetes.io/projected/3ada2676-04c4-4126-a943-cd1d167949aa-kube-api-access-k9jzm\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.588210 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.588083 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.588210 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.588138 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ada2676-04c4-4126-a943-cd1d167949aa-config-volume\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.588210 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.588157 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ada2676-04c4-4126-a943-cd1d167949aa-tmp-dir\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.588210 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.588179 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:15:11.588210 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.588206 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nxjw\" (UniqueName: \"kubernetes.io/projected/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-kube-api-access-5nxjw\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:15:11.660122 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.660089 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:15:11.663283 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.663246 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p5rcb\""
Apr 23 08:15:11.663852 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.663247 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 08:15:11.663852 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.663294 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 08:15:11.688623 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.688598 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nxjw\" (UniqueName: \"kubernetes.io/projected/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-kube-api-access-5nxjw\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:15:11.688749 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.688638 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9jzm\" (UniqueName: \"kubernetes.io/projected/3ada2676-04c4-4126-a943-cd1d167949aa-kube-api-access-k9jzm\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.688906 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.688885 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.688989 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.688936 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ada2676-04c4-4126-a943-cd1d167949aa-config-volume\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.688989 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.688960 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ada2676-04c4-4126-a943-cd1d167949aa-tmp-dir\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.688989 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.688984 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:15:11.689147 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:11.689024 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:15:11.689147 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:11.689079 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:15:11.689147 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:11.689100 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls podName:3ada2676-04c4-4126-a943-cd1d167949aa nodeName:}" failed. No retries permitted until 2026-04-23 08:15:12.189079268 +0000 UTC m=+34.120442046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls") pod "dns-default-46nvx" (UID: "3ada2676-04c4-4126-a943-cd1d167949aa") : secret "dns-default-metrics-tls" not found
Apr 23 08:15:11.689147 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:11.689136 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert podName:c7c0ad21-b2af-4a80-a79c-000cff3a91ab nodeName:}" failed. No retries permitted until 2026-04-23 08:15:12.189120789 +0000 UTC m=+34.120483570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert") pod "ingress-canary-ttph8" (UID: "c7c0ad21-b2af-4a80-a79c-000cff3a91ab") : secret "canary-serving-cert" not found
Apr 23 08:15:11.689384 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.689332 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ada2676-04c4-4126-a943-cd1d167949aa-tmp-dir\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.689564 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.689546 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ada2676-04c4-4126-a943-cd1d167949aa-config-volume\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.700085 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.700061 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9jzm\" (UniqueName: \"kubernetes.io/projected/3ada2676-04c4-4126-a943-cd1d167949aa-kube-api-access-k9jzm\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:11.700185 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:11.700115 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nxjw\" (UniqueName: \"kubernetes.io/projected/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-kube-api-access-5nxjw\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:15:12.193815 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:12.193765 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:15:12.194005 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:12.193939 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 08:15:12.194005 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:12.193948 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:15:12.194104 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:12.194027 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls podName:3ada2676-04c4-4126-a943-cd1d167949aa nodeName:}" failed. No retries permitted until 2026-04-23 08:15:13.194005643 +0000 UTC m=+35.125368426 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls") pod "dns-default-46nvx" (UID: "3ada2676-04c4-4126-a943-cd1d167949aa") : secret "dns-default-metrics-tls" not found
Apr 23 08:15:12.194104 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:12.194073 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 08:15:12.194209 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:12.194123 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert podName:c7c0ad21-b2af-4a80-a79c-000cff3a91ab nodeName:}" failed. No retries permitted until 2026-04-23 08:15:13.194107929 +0000 UTC m=+35.125470707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert") pod "ingress-canary-ttph8" (UID: "c7c0ad21-b2af-4a80-a79c-000cff3a91ab") : secret "canary-serving-cert" not found
Apr 23 08:15:12.295209 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:12.295171 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkltj\" (UniqueName: \"kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj\") pod \"network-check-target-mbfqt\" (UID: \"59f9a0a5-064a-4dd4-9790-0bff108c8fbe\") " pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:15:12.298009 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:12.297981 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkltj\" (UniqueName: \"kubernetes.io/projected/59f9a0a5-064a-4dd4-9790-0bff108c8fbe-kube-api-access-gkltj\") pod \"network-check-target-mbfqt\" (UID: \"59f9a0a5-064a-4dd4-9790-0bff108c8fbe\") " pod="openshift-network-diagnostics/network-check-target-mbfqt"
Apr 23 08:15:12.396170 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:12.396134 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55"
Apr 23 08:15:12.396350 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:12.396310 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:15:12.396401 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:12.396372 2561 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs podName:e92a791e-42ac-4855-b7b5-945f53108891 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:44.396358082 +0000 UTC m=+66.327720861 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs") pod "network-metrics-daemon-pmv55" (UID: "e92a791e-42ac-4855-b7b5-945f53108891") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:15:12.570747 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:12.570714 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:15:12.660215 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:12.660176 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:15:12.663273 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:12.663234 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:15:12.663273 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:12.663237 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vjj4z\"" Apr 23 08:15:13.202181 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:13.202140 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx" Apr 23 08:15:13.202387 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:13.202214 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8" Apr 23 08:15:13.202387 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:13.202323 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:13.202387 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:13.202356 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:13.202547 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:13.202398 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls podName:3ada2676-04c4-4126-a943-cd1d167949aa nodeName:}" failed. No retries permitted until 2026-04-23 08:15:15.202377117 +0000 UTC m=+37.133739907 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls") pod "dns-default-46nvx" (UID: "3ada2676-04c4-4126-a943-cd1d167949aa") : secret "dns-default-metrics-tls" not found Apr 23 08:15:13.202547 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:13.202416 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert podName:c7c0ad21-b2af-4a80-a79c-000cff3a91ab nodeName:}" failed. No retries permitted until 2026-04-23 08:15:15.202407742 +0000 UTC m=+37.133770524 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert") pod "ingress-canary-ttph8" (UID: "c7c0ad21-b2af-4a80-a79c-000cff3a91ab") : secret "canary-serving-cert" not found Apr 23 08:15:13.777351 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:13.777102 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mbfqt"] Apr 23 08:15:13.781543 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:15:13.781508 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59f9a0a5_064a_4dd4_9790_0bff108c8fbe.slice/crio-1db2166ed91b3d62a2c2ea61bfa1fc700e3dc5786ee1779c96afe3ed4a172f48 WatchSource:0}: Error finding container 1db2166ed91b3d62a2c2ea61bfa1fc700e3dc5786ee1779c96afe3ed4a172f48: Status 404 returned error can't find the container with id 1db2166ed91b3d62a2c2ea61bfa1fc700e3dc5786ee1779c96afe3ed4a172f48 Apr 23 08:15:13.808571 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:13.808546 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mbfqt" event={"ID":"59f9a0a5-064a-4dd4-9790-0bff108c8fbe","Type":"ContainerStarted","Data":"1db2166ed91b3d62a2c2ea61bfa1fc700e3dc5786ee1779c96afe3ed4a172f48"} Apr 23 08:15:14.813406 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:14.813363 2561 generic.go:358] "Generic (PLEG): container finished" podID="0a8488f0-d2d8-4107-b542-5f46729c4927" containerID="2e9dd4c7828cab0129d11727045c77ab9fca5d78c1774bdac9d178b57e37bbe5" exitCode=0 Apr 23 08:15:14.813824 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:14.813422 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg44l" event={"ID":"0a8488f0-d2d8-4107-b542-5f46729c4927","Type":"ContainerDied","Data":"2e9dd4c7828cab0129d11727045c77ab9fca5d78c1774bdac9d178b57e37bbe5"} Apr 23 08:15:15.229843 
ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:15.229811 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx" Apr 23 08:15:15.230033 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:15.229873 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8" Apr 23 08:15:15.230033 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:15.229989 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:15.230033 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:15.229991 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:15.230190 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:15.230038 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert podName:c7c0ad21-b2af-4a80-a79c-000cff3a91ab nodeName:}" failed. No retries permitted until 2026-04-23 08:15:19.230024609 +0000 UTC m=+41.161387388 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert") pod "ingress-canary-ttph8" (UID: "c7c0ad21-b2af-4a80-a79c-000cff3a91ab") : secret "canary-serving-cert" not found Apr 23 08:15:15.230190 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:15.230056 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls podName:3ada2676-04c4-4126-a943-cd1d167949aa nodeName:}" failed. No retries permitted until 2026-04-23 08:15:19.230047394 +0000 UTC m=+41.161410175 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls") pod "dns-default-46nvx" (UID: "3ada2676-04c4-4126-a943-cd1d167949aa") : secret "dns-default-metrics-tls" not found Apr 23 08:15:15.817558 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:15.817527 2561 generic.go:358] "Generic (PLEG): container finished" podID="0a8488f0-d2d8-4107-b542-5f46729c4927" containerID="bbaa5aca1f0ef60d48870aa8e83389746203eaa43e3006ef833a077ac2112e7f" exitCode=0 Apr 23 08:15:15.818084 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:15.817582 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg44l" event={"ID":"0a8488f0-d2d8-4107-b542-5f46729c4927","Type":"ContainerDied","Data":"bbaa5aca1f0ef60d48870aa8e83389746203eaa43e3006ef833a077ac2112e7f"} Apr 23 08:15:17.823731 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:17.823533 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg44l" event={"ID":"0a8488f0-d2d8-4107-b542-5f46729c4927","Type":"ContainerStarted","Data":"a614692e09b2586bef744390811348baf508640398bdd6bb8ecf31c30ca2c094"} Apr 23 08:15:17.824740 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:17.824719 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-mbfqt" event={"ID":"59f9a0a5-064a-4dd4-9790-0bff108c8fbe","Type":"ContainerStarted","Data":"13707b3483c3fa359970f112d36404a931a879e31f87dc73a515ea9a5899f059"} Apr 23 08:15:17.824841 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:17.824827 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:15:17.849056 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:17.848999 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hg44l" podStartSLOduration=7.468221933 podStartE2EDuration="39.848987881s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:14:41.423480206 +0000 UTC m=+3.354842989" lastFinishedPulling="2026-04-23 08:15:13.80424614 +0000 UTC m=+35.735608937" observedRunningTime="2026-04-23 08:15:17.847341426 +0000 UTC m=+39.778704227" watchObservedRunningTime="2026-04-23 08:15:17.848987881 +0000 UTC m=+39.780350682" Apr 23 08:15:19.258335 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:19.258295 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx" Apr 23 08:15:19.258691 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:19.258347 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8" Apr 23 08:15:19.258691 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:19.258434 2561 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:19.258691 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:19.258509 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls podName:3ada2676-04c4-4126-a943-cd1d167949aa nodeName:}" failed. No retries permitted until 2026-04-23 08:15:27.258492885 +0000 UTC m=+49.189855664 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls") pod "dns-default-46nvx" (UID: "3ada2676-04c4-4126-a943-cd1d167949aa") : secret "dns-default-metrics-tls" not found Apr 23 08:15:19.258691 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:19.258441 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:19.258691 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:19.258567 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert podName:c7c0ad21-b2af-4a80-a79c-000cff3a91ab nodeName:}" failed. No retries permitted until 2026-04-23 08:15:27.258555769 +0000 UTC m=+49.189918548 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert") pod "ingress-canary-ttph8" (UID: "c7c0ad21-b2af-4a80-a79c-000cff3a91ab") : secret "canary-serving-cert" not found Apr 23 08:15:27.304014 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:27.303978 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx" Apr 23 08:15:27.304500 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:27.304026 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8" Apr 23 08:15:27.304500 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:27.304108 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:27.304500 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:27.304109 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:27.304500 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:27.304174 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert podName:c7c0ad21-b2af-4a80-a79c-000cff3a91ab nodeName:}" failed. No retries permitted until 2026-04-23 08:15:43.304161224 +0000 UTC m=+65.235524003 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert") pod "ingress-canary-ttph8" (UID: "c7c0ad21-b2af-4a80-a79c-000cff3a91ab") : secret "canary-serving-cert" not found Apr 23 08:15:27.304500 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:27.304187 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls podName:3ada2676-04c4-4126-a943-cd1d167949aa nodeName:}" failed. No retries permitted until 2026-04-23 08:15:43.304181263 +0000 UTC m=+65.235544042 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls") pod "dns-default-46nvx" (UID: "3ada2676-04c4-4126-a943-cd1d167949aa") : secret "dns-default-metrics-tls" not found Apr 23 08:15:37.804381 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:37.804354 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v5wkc" Apr 23 08:15:37.833687 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:37.833644 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mbfqt" podStartSLOduration=56.432309617 podStartE2EDuration="59.83363222s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:15:13.783464904 +0000 UTC m=+35.714827686" lastFinishedPulling="2026-04-23 08:15:17.184787493 +0000 UTC m=+39.116150289" observedRunningTime="2026-04-23 08:15:17.861247047 +0000 UTC m=+39.792609852" watchObservedRunningTime="2026-04-23 08:15:37.83363222 +0000 UTC m=+59.764995025" Apr 23 08:15:43.401076 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:43.401033 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") 
pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8" Apr 23 08:15:43.401542 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:43.401095 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx" Apr 23 08:15:43.401542 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:43.401175 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:43.401542 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:43.401176 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:43.401542 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:43.401233 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls podName:3ada2676-04c4-4126-a943-cd1d167949aa nodeName:}" failed. No retries permitted until 2026-04-23 08:16:15.401219535 +0000 UTC m=+97.332582313 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls") pod "dns-default-46nvx" (UID: "3ada2676-04c4-4126-a943-cd1d167949aa") : secret "dns-default-metrics-tls" not found Apr 23 08:15:43.401542 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:43.401246 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert podName:c7c0ad21-b2af-4a80-a79c-000cff3a91ab nodeName:}" failed. No retries permitted until 2026-04-23 08:16:15.40124069 +0000 UTC m=+97.332603469 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert") pod "ingress-canary-ttph8" (UID: "c7c0ad21-b2af-4a80-a79c-000cff3a91ab") : secret "canary-serving-cert" not found Apr 23 08:15:44.408318 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:44.408283 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:15:44.411533 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:44.411518 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:15:44.419030 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:44.419015 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:15:44.419085 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:15:44.419075 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs podName:e92a791e-42ac-4855-b7b5-945f53108891 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:48.419060512 +0000 UTC m=+130.350423291 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs") pod "network-metrics-daemon-pmv55" (UID: "e92a791e-42ac-4855-b7b5-945f53108891") : secret "metrics-daemon-secret" not found Apr 23 08:15:48.829113 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:15:48.829081 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mbfqt" Apr 23 08:16:15.411490 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:15.411340 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx" Apr 23 08:16:15.411490 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:15.411410 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8" Apr 23 08:16:15.411987 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:15.411503 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:16:15.411987 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:15.411506 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:16:15.411987 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:15.411568 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert podName:c7c0ad21-b2af-4a80-a79c-000cff3a91ab nodeName:}" failed. 
No retries permitted until 2026-04-23 08:17:19.411548787 +0000 UTC m=+161.342911566 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert") pod "ingress-canary-ttph8" (UID: "c7c0ad21-b2af-4a80-a79c-000cff3a91ab") : secret "canary-serving-cert" not found Apr 23 08:16:15.411987 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:15.411584 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls podName:3ada2676-04c4-4126-a943-cd1d167949aa nodeName:}" failed. No retries permitted until 2026-04-23 08:17:19.411576891 +0000 UTC m=+161.342939670 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls") pod "dns-default-46nvx" (UID: "3ada2676-04c4-4126-a943-cd1d167949aa") : secret "dns-default-metrics-tls" not found Apr 23 08:16:27.176305 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.176255 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v"] Apr 23 08:16:27.180407 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.180381 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-km78h"] Apr 23 08:16:27.180562 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.180542 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.183016 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.182997 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.183226 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.183201 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 08:16:27.183483 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.183467 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 23 08:16:27.183570 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.183554 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 08:16:27.184777 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.184759 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 23 08:16:27.184849 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.184801 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-49dql\"" Apr 23 08:16:27.185668 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.185651 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 08:16:27.185770 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.185655 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 08:16:27.186072 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.186055 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-mp7lp\"" Apr 23 08:16:27.186072 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.186064 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 08:16:27.186245 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.186059 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:16:27.190472 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.190455 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-km78h"] Apr 23 08:16:27.190610 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.190559 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 08:16:27.191133 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.191115 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v"] Apr 23 08:16:27.289787 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.289761 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.289787 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.289788 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhkw6\" (UniqueName: \"kubernetes.io/projected/2c1dd227-3279-4f30-b918-473a6a080619-kube-api-access-lhkw6\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.289978 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.289805 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cdf9e60-6a76-44c7-a819-39654b29c96a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.289978 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.289822 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2c1dd227-3279-4f30-b918-473a6a080619-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.289978 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.289845 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2cdf9e60-6a76-44c7-a819-39654b29c96a-snapshots\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.289978 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.289901 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cdf9e60-6a76-44c7-a819-39654b29c96a-serving-cert\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.289978 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.289963 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jsn9\" (UniqueName: \"kubernetes.io/projected/2cdf9e60-6a76-44c7-a819-39654b29c96a-kube-api-access-2jsn9\") pod 
\"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.290170 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.290001 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cdf9e60-6a76-44c7-a819-39654b29c96a-tmp\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.290170 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.290016 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cdf9e60-6a76-44c7-a819-39654b29c96a-service-ca-bundle\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.390464 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.390432 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.390583 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.390473 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhkw6\" (UniqueName: \"kubernetes.io/projected/2c1dd227-3279-4f30-b918-473a6a080619-kube-api-access-lhkw6\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.390583 
ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.390496 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cdf9e60-6a76-44c7-a819-39654b29c96a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.390583 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.390523 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2c1dd227-3279-4f30-b918-473a6a080619-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.390583 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.390548 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2cdf9e60-6a76-44c7-a819-39654b29c96a-snapshots\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.390583 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:27.390571 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:27.390818 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:27.390660 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls podName:2c1dd227-3279-4f30-b918-473a6a080619 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:27.890636287 +0000 UTC m=+109.821999083 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cbh4v" (UID: "2c1dd227-3279-4f30-b918-473a6a080619") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:27.390818 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.390573 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cdf9e60-6a76-44c7-a819-39654b29c96a-serving-cert\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.390818 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.390755 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jsn9\" (UniqueName: \"kubernetes.io/projected/2cdf9e60-6a76-44c7-a819-39654b29c96a-kube-api-access-2jsn9\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.390818 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.390805 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cdf9e60-6a76-44c7-a819-39654b29c96a-tmp\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.391022 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.390832 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cdf9e60-6a76-44c7-a819-39654b29c96a-service-ca-bundle\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " 
pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.391280 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.391235 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2cdf9e60-6a76-44c7-a819-39654b29c96a-snapshots\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.391407 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.391237 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cdf9e60-6a76-44c7-a819-39654b29c96a-tmp\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.391407 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.391329 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2c1dd227-3279-4f30-b918-473a6a080619-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.391407 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.391387 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cdf9e60-6a76-44c7-a819-39654b29c96a-service-ca-bundle\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.391566 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.391537 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2cdf9e60-6a76-44c7-a819-39654b29c96a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.394322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.394304 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cdf9e60-6a76-44c7-a819-39654b29c96a-serving-cert\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.399507 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.399480 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhkw6\" (UniqueName: \"kubernetes.io/projected/2c1dd227-3279-4f30-b918-473a6a080619-kube-api-access-lhkw6\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.399682 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.399666 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jsn9\" (UniqueName: \"kubernetes.io/projected/2cdf9e60-6a76-44c7-a819-39654b29c96a-kube-api-access-2jsn9\") pod \"insights-operator-585dfdc468-km78h\" (UID: \"2cdf9e60-6a76-44c7-a819-39654b29c96a\") " pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.496424 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.496403 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-km78h" Apr 23 08:16:27.604984 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.604951 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-km78h"] Apr 23 08:16:27.607523 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:16:27.607495 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cdf9e60_6a76_44c7_a819_39654b29c96a.slice/crio-d8e48b5761278395faae4fbc39ac8b8450c69412ceacb1bc3d538d7b77227597 WatchSource:0}: Error finding container d8e48b5761278395faae4fbc39ac8b8450c69412ceacb1bc3d538d7b77227597: Status 404 returned error can't find the container with id d8e48b5761278395faae4fbc39ac8b8450c69412ceacb1bc3d538d7b77227597 Apr 23 08:16:27.893772 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.893674 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:27.893916 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:27.893828 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:27.893916 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:27.893894 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls podName:2c1dd227-3279-4f30-b918-473a6a080619 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:28.893879158 +0000 UTC m=+110.825241941 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cbh4v" (UID: "2c1dd227-3279-4f30-b918-473a6a080619") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:27.953933 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:27.953905 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-km78h" event={"ID":"2cdf9e60-6a76-44c7-a819-39654b29c96a","Type":"ContainerStarted","Data":"d8e48b5761278395faae4fbc39ac8b8450c69412ceacb1bc3d538d7b77227597"} Apr 23 08:16:28.900977 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:28.900940 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:28.901561 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:28.901061 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:28.901561 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:28.901116 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls podName:2c1dd227-3279-4f30-b918-473a6a080619 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:30.901101654 +0000 UTC m=+112.832464433 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cbh4v" (UID: "2c1dd227-3279-4f30-b918-473a6a080619") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:29.958911 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:29.958872 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-km78h" event={"ID":"2cdf9e60-6a76-44c7-a819-39654b29c96a","Type":"ContainerStarted","Data":"c759433d42fa8e34fbe8df17aa7aa80694928d0341be963683f6775013fbabd8"} Apr 23 08:16:29.974966 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:29.974834 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-km78h" podStartSLOduration=1.12646422 podStartE2EDuration="2.974818838s" podCreationTimestamp="2026-04-23 08:16:27 +0000 UTC" firstStartedPulling="2026-04-23 08:16:27.609329116 +0000 UTC m=+109.540691901" lastFinishedPulling="2026-04-23 08:16:29.457683741 +0000 UTC m=+111.389046519" observedRunningTime="2026-04-23 08:16:29.974346396 +0000 UTC m=+111.905709198" watchObservedRunningTime="2026-04-23 08:16:29.974818838 +0000 UTC m=+111.906181641" Apr 23 08:16:30.917963 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:30.917919 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:30.918128 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:30.918060 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret 
"cluster-monitoring-operator-tls" not found Apr 23 08:16:30.918168 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:30.918127 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls podName:2c1dd227-3279-4f30-b918-473a6a080619 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:34.918109222 +0000 UTC m=+116.849472001 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cbh4v" (UID: "2c1dd227-3279-4f30-b918-473a6a080619") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:33.169208 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:33.169179 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vdfxl_556cc9f0-a576-455e-b539-83577cba025c/dns-node-resolver/0.log" Apr 23 08:16:33.769245 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:33.769218 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-86wvz_3f5d8347-124b-469f-8ac6-0c963d6c4634/node-ca/0.log" Apr 23 08:16:34.945240 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:34.945207 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:34.945592 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:34.945361 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:34.945592 
ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:34.945427 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls podName:2c1dd227-3279-4f30-b918-473a6a080619 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:42.945409914 +0000 UTC m=+124.876772696 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cbh4v" (UID: "2c1dd227-3279-4f30-b918-473a6a080619") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:37.144991 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.144959 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj"] Apr 23 08:16:37.147874 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.147858 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj" Apr 23 08:16:37.150494 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.150468 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 23 08:16:37.151518 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.151498 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:16:37.151578 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.151565 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-w7h7t\"" Apr 23 08:16:37.157165 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.157137 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj"] Apr 23 08:16:37.249033 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.249005 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz"] Apr 23 08:16:37.251798 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.251783 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:37.254630 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.254612 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 23 08:16:37.255902 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.255884 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm"] Apr 23 08:16:37.256782 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.256759 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-c2nnb\"" Apr 23 08:16:37.256990 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.256973 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:16:37.257100 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.256994 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 23 08:16:37.258478 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.258462 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-94fwt"] Apr 23 08:16:37.258597 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.258584 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.260345 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.260326 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htq8h\" (UniqueName: \"kubernetes.io/projected/e488b11b-324c-4989-b28f-8aaa6ecd0cab-kube-api-access-htq8h\") pod \"volume-data-source-validator-7c6cbb6c87-jmnkj\" (UID: \"e488b11b-324c-4989-b28f-8aaa6ecd0cab\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj" Apr 23 08:16:37.261032 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.261016 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 23 08:16:37.261090 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.261031 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 23 08:16:37.261355 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.261334 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:16:37.261442 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.261411 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q"] Apr 23 08:16:37.261507 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.261491 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 23 08:16:37.261552 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.261524 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-fnr2h\"" Apr 23 08:16:37.261603 ip-10-0-134-8 
kubenswrapper[2561]: I0423 08:16:37.261568 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.264074 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.264056 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:16:37.264165 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.264074 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 23 08:16:37.264165 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.264120 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-rwzd5\"" Apr 23 08:16:37.264165 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.264145 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz"] Apr 23 08:16:37.264165 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.264074 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 23 08:16:37.264400 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.264185 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 23 08:16:37.264400 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.264233 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.267452 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.267433 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 23 08:16:37.267564 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.267551 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 23 08:16:37.267693 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.267676 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 23 08:16:37.268367 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.268346 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-7lrdj\"" Apr 23 08:16:37.268776 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.268755 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:16:37.269937 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.269918 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm"] Apr 23 08:16:37.270825 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.270805 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q"] Apr 23 08:16:37.271285 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.271254 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 23 08:16:37.281934 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.281914 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-94fwt"] Apr 23 08:16:37.361467 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361444 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfhlm\" (UniqueName: \"kubernetes.io/projected/a42ca4f9-a9ae-4413-9c3d-fa18098d565a-kube-api-access-kfhlm\") pod \"kube-storage-version-migrator-operator-6769c5d45-6qc5q\" (UID: \"a42ca4f9-a9ae-4413-9c3d-fa18098d565a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.361593 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361474 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d38a2fb0-c776-4a02-95f3-1c68963e1ef7-config\") pod \"service-ca-operator-d6fc45fc5-zgcjm\" (UID: \"d38a2fb0-c776-4a02-95f3-1c68963e1ef7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.361593 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361491 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx6dj\" (UniqueName: \"kubernetes.io/projected/d460c450-63cc-49ec-af6a-6618277ea5cf-kube-api-access-dx6dj\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.361593 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361508 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a42ca4f9-a9ae-4413-9c3d-fa18098d565a-serving-cert\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-6qc5q\" (UID: \"a42ca4f9-a9ae-4413-9c3d-fa18098d565a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.361593 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361557 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42ca4f9-a9ae-4413-9c3d-fa18098d565a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6qc5q\" (UID: \"a42ca4f9-a9ae-4413-9c3d-fa18098d565a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.361738 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361628 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d38a2fb0-c776-4a02-95f3-1c68963e1ef7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zgcjm\" (UID: \"d38a2fb0-c776-4a02-95f3-1c68963e1ef7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.361738 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361659 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d460c450-63cc-49ec-af6a-6618277ea5cf-serving-cert\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.361738 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361692 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d460c450-63cc-49ec-af6a-6618277ea5cf-trusted-ca\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.361825 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361741 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flq4c\" (UniqueName: \"kubernetes.io/projected/ad2217d3-d0c5-4c5e-a920-ef66096532e4-kube-api-access-flq4c\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:37.361825 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361759 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d460c450-63cc-49ec-af6a-6618277ea5cf-config\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.361825 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361789 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd2gm\" (UniqueName: \"kubernetes.io/projected/d38a2fb0-c776-4a02-95f3-1c68963e1ef7-kube-api-access-cd2gm\") pod \"service-ca-operator-d6fc45fc5-zgcjm\" (UID: \"d38a2fb0-c776-4a02-95f3-1c68963e1ef7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.361825 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.361816 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:37.361941 ip-10-0-134-8 
kubenswrapper[2561]: I0423 08:16:37.361871 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htq8h\" (UniqueName: \"kubernetes.io/projected/e488b11b-324c-4989-b28f-8aaa6ecd0cab-kube-api-access-htq8h\") pod \"volume-data-source-validator-7c6cbb6c87-jmnkj\" (UID: \"e488b11b-324c-4989-b28f-8aaa6ecd0cab\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj" Apr 23 08:16:37.369953 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.369931 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htq8h\" (UniqueName: \"kubernetes.io/projected/e488b11b-324c-4989-b28f-8aaa6ecd0cab-kube-api-access-htq8h\") pod \"volume-data-source-validator-7c6cbb6c87-jmnkj\" (UID: \"e488b11b-324c-4989-b28f-8aaa6ecd0cab\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj" Apr 23 08:16:37.456011 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.455990 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj" Apr 23 08:16:37.462823 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.462805 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfhlm\" (UniqueName: \"kubernetes.io/projected/a42ca4f9-a9ae-4413-9c3d-fa18098d565a-kube-api-access-kfhlm\") pod \"kube-storage-version-migrator-operator-6769c5d45-6qc5q\" (UID: \"a42ca4f9-a9ae-4413-9c3d-fa18098d565a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.462925 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.462838 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d38a2fb0-c776-4a02-95f3-1c68963e1ef7-config\") pod \"service-ca-operator-d6fc45fc5-zgcjm\" (UID: \"d38a2fb0-c776-4a02-95f3-1c68963e1ef7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.462925 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.462863 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dx6dj\" (UniqueName: \"kubernetes.io/projected/d460c450-63cc-49ec-af6a-6618277ea5cf-kube-api-access-dx6dj\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.462925 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.462890 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a42ca4f9-a9ae-4413-9c3d-fa18098d565a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6qc5q\" (UID: \"a42ca4f9-a9ae-4413-9c3d-fa18098d565a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 
08:16:37.462925 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.462914 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42ca4f9-a9ae-4413-9c3d-fa18098d565a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6qc5q\" (UID: \"a42ca4f9-a9ae-4413-9c3d-fa18098d565a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.463130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.462944 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d38a2fb0-c776-4a02-95f3-1c68963e1ef7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zgcjm\" (UID: \"d38a2fb0-c776-4a02-95f3-1c68963e1ef7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.463130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.462985 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d460c450-63cc-49ec-af6a-6618277ea5cf-serving-cert\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.463130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.463010 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d460c450-63cc-49ec-af6a-6618277ea5cf-trusted-ca\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.463130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.463051 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flq4c\" (UniqueName: 
\"kubernetes.io/projected/ad2217d3-d0c5-4c5e-a920-ef66096532e4-kube-api-access-flq4c\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:37.463130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.463078 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d460c450-63cc-49ec-af6a-6618277ea5cf-config\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.463130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.463125 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd2gm\" (UniqueName: \"kubernetes.io/projected/d38a2fb0-c776-4a02-95f3-1c68963e1ef7-kube-api-access-cd2gm\") pod \"service-ca-operator-d6fc45fc5-zgcjm\" (UID: \"d38a2fb0-c776-4a02-95f3-1c68963e1ef7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.463667 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.463152 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:37.463667 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:37.463317 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:16:37.463667 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:37.463374 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls podName:ad2217d3-d0c5-4c5e-a920-ef66096532e4 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:37.963357174 +0000 UTC m=+119.894719956 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f27qz" (UID: "ad2217d3-d0c5-4c5e-a920-ef66096532e4") : secret "samples-operator-tls" not found Apr 23 08:16:37.463667 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.463416 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d38a2fb0-c776-4a02-95f3-1c68963e1ef7-config\") pod \"service-ca-operator-d6fc45fc5-zgcjm\" (UID: \"d38a2fb0-c776-4a02-95f3-1c68963e1ef7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.464100 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.464049 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d460c450-63cc-49ec-af6a-6618277ea5cf-config\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.464198 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.464110 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42ca4f9-a9ae-4413-9c3d-fa18098d565a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6qc5q\" (UID: \"a42ca4f9-a9ae-4413-9c3d-fa18098d565a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.464442 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.464425 2561 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d460c450-63cc-49ec-af6a-6618277ea5cf-trusted-ca\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.465988 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.465964 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d38a2fb0-c776-4a02-95f3-1c68963e1ef7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zgcjm\" (UID: \"d38a2fb0-c776-4a02-95f3-1c68963e1ef7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.466093 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.465964 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a42ca4f9-a9ae-4413-9c3d-fa18098d565a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6qc5q\" (UID: \"a42ca4f9-a9ae-4413-9c3d-fa18098d565a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.466093 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.466072 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d460c450-63cc-49ec-af6a-6618277ea5cf-serving-cert\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.473754 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.473735 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx6dj\" (UniqueName: \"kubernetes.io/projected/d460c450-63cc-49ec-af6a-6618277ea5cf-kube-api-access-dx6dj\") pod \"console-operator-9d4b6777b-94fwt\" (UID: \"d460c450-63cc-49ec-af6a-6618277ea5cf\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.474027 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.474004 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfhlm\" (UniqueName: \"kubernetes.io/projected/a42ca4f9-a9ae-4413-9c3d-fa18098d565a-kube-api-access-kfhlm\") pod \"kube-storage-version-migrator-operator-6769c5d45-6qc5q\" (UID: \"a42ca4f9-a9ae-4413-9c3d-fa18098d565a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.474863 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.474838 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flq4c\" (UniqueName: \"kubernetes.io/projected/ad2217d3-d0c5-4c5e-a920-ef66096532e4-kube-api-access-flq4c\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:37.475072 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.475055 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd2gm\" (UniqueName: \"kubernetes.io/projected/d38a2fb0-c776-4a02-95f3-1c68963e1ef7-kube-api-access-cd2gm\") pod \"service-ca-operator-d6fc45fc5-zgcjm\" (UID: \"d38a2fb0-c776-4a02-95f3-1c68963e1ef7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.569913 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.569871 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" Apr 23 08:16:37.571355 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.571334 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj"] Apr 23 08:16:37.575874 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:16:37.575854 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode488b11b_324c_4989_b28f_8aaa6ecd0cab.slice/crio-01d6128f408d1e8533a537d61790dea433ed218b1e00464abeab0a07b7fa94fd WatchSource:0}: Error finding container 01d6128f408d1e8533a537d61790dea433ed218b1e00464abeab0a07b7fa94fd: Status 404 returned error can't find the container with id 01d6128f408d1e8533a537d61790dea433ed218b1e00464abeab0a07b7fa94fd Apr 23 08:16:37.576822 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.576803 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:37.582650 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.582630 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" Apr 23 08:16:37.700005 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.699977 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm"] Apr 23 08:16:37.702656 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:16:37.702630 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd38a2fb0_c776_4a02_95f3_1c68963e1ef7.slice/crio-29dc66f910d73e42f6e7f0fd336750b3bf9d8928517e454a52d663000531fb52 WatchSource:0}: Error finding container 29dc66f910d73e42f6e7f0fd336750b3bf9d8928517e454a52d663000531fb52: Status 404 returned error can't find the container with id 29dc66f910d73e42f6e7f0fd336750b3bf9d8928517e454a52d663000531fb52 Apr 23 08:16:37.928350 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.928325 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q"] Apr 23 08:16:37.930794 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:16:37.930766 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42ca4f9_a9ae_4413_9c3d_fa18098d565a.slice/crio-a22cba0d32f501709b368e8e4987c42a43348608e611cf2640a31430393817fe WatchSource:0}: Error finding container a22cba0d32f501709b368e8e4987c42a43348608e611cf2640a31430393817fe: Status 404 returned error can't find the container with id a22cba0d32f501709b368e8e4987c42a43348608e611cf2640a31430393817fe Apr 23 08:16:37.931969 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.931928 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-94fwt"] Apr 23 08:16:37.935022 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:16:37.935001 2561 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd460c450_63cc_49ec_af6a_6618277ea5cf.slice/crio-b201bb2c02987ac4267ec8324082bc3f949bf68f2e49d84ec310955ef11cb467 WatchSource:0}: Error finding container b201bb2c02987ac4267ec8324082bc3f949bf68f2e49d84ec310955ef11cb467: Status 404 returned error can't find the container with id b201bb2c02987ac4267ec8324082bc3f949bf68f2e49d84ec310955ef11cb467 Apr 23 08:16:37.967330 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.967252 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:37.967437 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:37.967417 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:16:37.967498 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:37.967491 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls podName:ad2217d3-d0c5-4c5e-a920-ef66096532e4 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:38.967469169 +0000 UTC m=+120.898831950 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f27qz" (UID: "ad2217d3-d0c5-4c5e-a920-ef66096532e4") : secret "samples-operator-tls" not found Apr 23 08:16:37.976843 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.976820 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" event={"ID":"a42ca4f9-a9ae-4413-9c3d-fa18098d565a","Type":"ContainerStarted","Data":"a22cba0d32f501709b368e8e4987c42a43348608e611cf2640a31430393817fe"} Apr 23 08:16:37.977846 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.977820 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" event={"ID":"d460c450-63cc-49ec-af6a-6618277ea5cf","Type":"ContainerStarted","Data":"b201bb2c02987ac4267ec8324082bc3f949bf68f2e49d84ec310955ef11cb467"} Apr 23 08:16:37.978718 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.978698 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" event={"ID":"d38a2fb0-c776-4a02-95f3-1c68963e1ef7","Type":"ContainerStarted","Data":"29dc66f910d73e42f6e7f0fd336750b3bf9d8928517e454a52d663000531fb52"} Apr 23 08:16:37.979649 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:37.979631 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj" event={"ID":"e488b11b-324c-4989-b28f-8aaa6ecd0cab","Type":"ContainerStarted","Data":"01d6128f408d1e8533a537d61790dea433ed218b1e00464abeab0a07b7fa94fd"} Apr 23 08:16:38.975682 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:38.975519 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:38.976096 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:38.975737 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:16:38.976096 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:38.975823 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls podName:ad2217d3-d0c5-4c5e-a920-ef66096532e4 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:40.975801956 +0000 UTC m=+122.907164748 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f27qz" (UID: "ad2217d3-d0c5-4c5e-a920-ef66096532e4") : secret "samples-operator-tls" not found Apr 23 08:16:40.993496 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:40.993469 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:40.993851 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:40.993624 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:16:40.993851 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:40.993694 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls podName:ad2217d3-d0c5-4c5e-a920-ef66096532e4 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:44.99367425 +0000 UTC m=+126.925037046 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f27qz" (UID: "ad2217d3-d0c5-4c5e-a920-ef66096532e4") : secret "samples-operator-tls" not found Apr 23 08:16:41.988744 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:41.988704 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" event={"ID":"d38a2fb0-c776-4a02-95f3-1c68963e1ef7","Type":"ContainerStarted","Data":"4bab73a84a6d6da982c3daf6b66d66b56a070cc1ec4de0e6c51a3f121bb0b1f4"} Apr 23 08:16:41.990063 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:41.990030 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj" event={"ID":"e488b11b-324c-4989-b28f-8aaa6ecd0cab","Type":"ContainerStarted","Data":"fe96d12b4eae342f431e7fea84c59697904686efdef662a008d246cd313a532c"} Apr 23 08:16:41.991797 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:41.991463 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" event={"ID":"a42ca4f9-a9ae-4413-9c3d-fa18098d565a","Type":"ContainerStarted","Data":"727a231f06d78f1084fd3931bef25e3060bc82255f37e871aee9d11419881742"} Apr 23 08:16:41.994759 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:41.994740 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/0.log" Apr 23 08:16:41.995043 ip-10-0-134-8 
kubenswrapper[2561]: I0423 08:16:41.994772 2561 generic.go:358] "Generic (PLEG): container finished" podID="d460c450-63cc-49ec-af6a-6618277ea5cf" containerID="93397b36cdede397252a53945510af5d8806f44847ccc19fd3b6e612b13b7644" exitCode=255 Apr 23 08:16:41.995043 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:41.994814 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" event={"ID":"d460c450-63cc-49ec-af6a-6618277ea5cf","Type":"ContainerDied","Data":"93397b36cdede397252a53945510af5d8806f44847ccc19fd3b6e612b13b7644"} Apr 23 08:16:41.995043 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:41.994983 2561 scope.go:117] "RemoveContainer" containerID="93397b36cdede397252a53945510af5d8806f44847ccc19fd3b6e612b13b7644" Apr 23 08:16:42.005961 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.005924 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" podStartSLOduration=1.7730339929999999 podStartE2EDuration="5.005913203s" podCreationTimestamp="2026-04-23 08:16:37 +0000 UTC" firstStartedPulling="2026-04-23 08:16:37.704881343 +0000 UTC m=+119.636244124" lastFinishedPulling="2026-04-23 08:16:40.937760541 +0000 UTC m=+122.869123334" observedRunningTime="2026-04-23 08:16:42.004861311 +0000 UTC m=+123.936224115" watchObservedRunningTime="2026-04-23 08:16:42.005913203 +0000 UTC m=+123.937276040" Apr 23 08:16:42.043443 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.043328 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jmnkj" podStartSLOduration=1.688168783 podStartE2EDuration="5.043313667s" podCreationTimestamp="2026-04-23 08:16:37 +0000 UTC" firstStartedPulling="2026-04-23 08:16:37.577896823 +0000 UTC m=+119.509259615" lastFinishedPulling="2026-04-23 08:16:40.933041705 +0000 UTC m=+122.864404499" 
observedRunningTime="2026-04-23 08:16:42.042552516 +0000 UTC m=+123.973915318" watchObservedRunningTime="2026-04-23 08:16:42.043313667 +0000 UTC m=+123.974676466" Apr 23 08:16:42.059687 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.059643 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" podStartSLOduration=2.049579455 podStartE2EDuration="5.059630562s" podCreationTimestamp="2026-04-23 08:16:37 +0000 UTC" firstStartedPulling="2026-04-23 08:16:37.933007961 +0000 UTC m=+119.864370744" lastFinishedPulling="2026-04-23 08:16:40.943059068 +0000 UTC m=+122.874421851" observedRunningTime="2026-04-23 08:16:42.058640522 +0000 UTC m=+123.990003326" watchObservedRunningTime="2026-04-23 08:16:42.059630562 +0000 UTC m=+123.990993362" Apr 23 08:16:42.709998 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.709958 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn"] Apr 23 08:16:42.712933 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.712911 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn" Apr 23 08:16:42.715602 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.715580 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 08:16:42.715679 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.715580 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 08:16:42.716682 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.716667 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6xwww\"" Apr 23 08:16:42.722371 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.722347 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn"] Apr 23 08:16:42.809157 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.809123 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5zsx\" (UniqueName: \"kubernetes.io/projected/7097634a-9704-4d5b-a292-4c6376bde24a-kube-api-access-s5zsx\") pod \"migrator-74bb7799d9-pjbfn\" (UID: \"7097634a-9704-4d5b-a292-4c6376bde24a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn" Apr 23 08:16:42.910461 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.910419 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zsx\" (UniqueName: \"kubernetes.io/projected/7097634a-9704-4d5b-a292-4c6376bde24a-kube-api-access-s5zsx\") pod \"migrator-74bb7799d9-pjbfn\" (UID: \"7097634a-9704-4d5b-a292-4c6376bde24a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn" Apr 23 08:16:42.918865 ip-10-0-134-8 kubenswrapper[2561]: I0423 
08:16:42.918843 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5zsx\" (UniqueName: \"kubernetes.io/projected/7097634a-9704-4d5b-a292-4c6376bde24a-kube-api-access-s5zsx\") pod \"migrator-74bb7799d9-pjbfn\" (UID: \"7097634a-9704-4d5b-a292-4c6376bde24a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn" Apr 23 08:16:42.998848 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.998793 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:16:42.999158 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.999145 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/0.log" Apr 23 08:16:42.999196 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.999177 2561 generic.go:358] "Generic (PLEG): container finished" podID="d460c450-63cc-49ec-af6a-6618277ea5cf" containerID="1328b3731263bbac732ebcfa49f9349fb141af7de5bb6c04fff478de49502f80" exitCode=255 Apr 23 08:16:42.999330 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.999301 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" event={"ID":"d460c450-63cc-49ec-af6a-6618277ea5cf","Type":"ContainerDied","Data":"1328b3731263bbac732ebcfa49f9349fb141af7de5bb6c04fff478de49502f80"} Apr 23 08:16:42.999449 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.999343 2561 scope.go:117] "RemoveContainer" containerID="93397b36cdede397252a53945510af5d8806f44847ccc19fd3b6e612b13b7644" Apr 23 08:16:42.999532 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:42.999512 2561 scope.go:117] "RemoveContainer" containerID="1328b3731263bbac732ebcfa49f9349fb141af7de5bb6c04fff478de49502f80" Apr 23 08:16:42.999726 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:42.999702 
2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-94fwt_openshift-console-operator(d460c450-63cc-49ec-af6a-6618277ea5cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" podUID="d460c450-63cc-49ec-af6a-6618277ea5cf" Apr 23 08:16:43.011802 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:43.011774 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" Apr 23 08:16:43.011928 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:43.011913 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:43.011987 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:43.011978 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls podName:2c1dd227-3279-4f30-b918-473a6a080619 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:59.011963056 +0000 UTC m=+140.943325834 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cbh4v" (UID: "2c1dd227-3279-4f30-b918-473a6a080619") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:16:43.022054 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:43.022038 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn" Apr 23 08:16:43.134117 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:43.134091 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn"] Apr 23 08:16:43.136904 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:16:43.136879 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7097634a_9704_4d5b_a292_4c6376bde24a.slice/crio-4e82eca4eca1a918a778a977f7bd1dacf84821c7e55b8c6683a5267f164b7eea WatchSource:0}: Error finding container 4e82eca4eca1a918a778a977f7bd1dacf84821c7e55b8c6683a5267f164b7eea: Status 404 returned error can't find the container with id 4e82eca4eca1a918a778a977f7bd1dacf84821c7e55b8c6683a5267f164b7eea Apr 23 08:16:44.003518 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.003479 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn" event={"ID":"7097634a-9704-4d5b-a292-4c6376bde24a","Type":"ContainerStarted","Data":"4e82eca4eca1a918a778a977f7bd1dacf84821c7e55b8c6683a5267f164b7eea"} Apr 23 08:16:44.004929 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.004907 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:16:44.005398 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.005378 2561 scope.go:117] "RemoveContainer" containerID="1328b3731263bbac732ebcfa49f9349fb141af7de5bb6c04fff478de49502f80" Apr 23 08:16:44.005580 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:44.005560 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-94fwt_openshift-console-operator(d460c450-63cc-49ec-af6a-6618277ea5cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" podUID="d460c450-63cc-49ec-af6a-6618277ea5cf" Apr 23 08:16:44.227040 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.227006 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-6bhbs"] Apr 23 08:16:44.228879 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.228860 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.232223 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.232198 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 08:16:44.232358 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.232243 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jrz9m\"" Apr 23 08:16:44.232358 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.232243 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 08:16:44.232358 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.232350 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 08:16:44.232560 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.232549 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 08:16:44.240188 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.240153 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-6bhbs"] Apr 23 08:16:44.321212 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.321143 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7dc68295-1c6a-4ff7-a186-4c27b4e63f84-signing-key\") pod \"service-ca-865cb79987-6bhbs\" (UID: \"7dc68295-1c6a-4ff7-a186-4c27b4e63f84\") " pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.321212 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.321201 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwndt\" (UniqueName: \"kubernetes.io/projected/7dc68295-1c6a-4ff7-a186-4c27b4e63f84-kube-api-access-bwndt\") pod \"service-ca-865cb79987-6bhbs\" (UID: \"7dc68295-1c6a-4ff7-a186-4c27b4e63f84\") " pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.321380 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.321338 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7dc68295-1c6a-4ff7-a186-4c27b4e63f84-signing-cabundle\") pod \"service-ca-865cb79987-6bhbs\" (UID: \"7dc68295-1c6a-4ff7-a186-4c27b4e63f84\") " pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.417581 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.417548 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gvqs5"] Apr 23 08:16:44.419838 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.419815 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.422245 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.422219 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwndt\" (UniqueName: \"kubernetes.io/projected/7dc68295-1c6a-4ff7-a186-4c27b4e63f84-kube-api-access-bwndt\") pod \"service-ca-865cb79987-6bhbs\" (UID: \"7dc68295-1c6a-4ff7-a186-4c27b4e63f84\") " pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.422460 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.422429 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7dc68295-1c6a-4ff7-a186-4c27b4e63f84-signing-cabundle\") pod \"service-ca-865cb79987-6bhbs\" (UID: \"7dc68295-1c6a-4ff7-a186-4c27b4e63f84\") " pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.422584 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.422496 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7dc68295-1c6a-4ff7-a186-4c27b4e63f84-signing-key\") pod \"service-ca-865cb79987-6bhbs\" (UID: \"7dc68295-1c6a-4ff7-a186-4c27b4e63f84\") " pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.423246 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.423209 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7dc68295-1c6a-4ff7-a186-4c27b4e63f84-signing-cabundle\") pod \"service-ca-865cb79987-6bhbs\" (UID: \"7dc68295-1c6a-4ff7-a186-4c27b4e63f84\") " pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.424120 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.424097 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-npvv9\"" Apr 23 
08:16:44.424613 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.424582 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 08:16:44.424724 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.424657 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 08:16:44.425418 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.425395 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7dc68295-1c6a-4ff7-a186-4c27b4e63f84-signing-key\") pod \"service-ca-865cb79987-6bhbs\" (UID: \"7dc68295-1c6a-4ff7-a186-4c27b4e63f84\") " pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.434638 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.434610 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gvqs5"] Apr 23 08:16:44.439885 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.439858 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwndt\" (UniqueName: \"kubernetes.io/projected/7dc68295-1c6a-4ff7-a186-4c27b4e63f84-kube-api-access-bwndt\") pod \"service-ca-865cb79987-6bhbs\" (UID: \"7dc68295-1c6a-4ff7-a186-4c27b4e63f84\") " pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.523248 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.523216 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.523420 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.523278 2561 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.523420 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.523338 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-data-volume\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.523420 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.523410 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-crio-socket\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.523586 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.523438 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89wsk\" (UniqueName: \"kubernetes.io/projected/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-kube-api-access-89wsk\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.539199 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.539170 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-6bhbs" Apr 23 08:16:44.624880 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.624842 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.625009 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.624893 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.625009 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.624939 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-data-volume\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.625009 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:44.624995 2561 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:44.625179 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.625008 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-crio-socket\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " 
pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.625179 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.625054 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89wsk\" (UniqueName: \"kubernetes.io/projected/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-kube-api-access-89wsk\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.625179 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:44.625080 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls podName:34496dd2-18a1-4fe2-a3be-b2d24e4bd928 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:45.125057398 +0000 UTC m=+127.056420190 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gvqs5" (UID: "34496dd2-18a1-4fe2-a3be-b2d24e4bd928") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:44.625513 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.625465 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-crio-socket\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.625769 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.625665 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-data-volume\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " 
pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.626393 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.626372 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.634161 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.634144 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89wsk\" (UniqueName: \"kubernetes.io/projected/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-kube-api-access-89wsk\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:44.669917 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:44.669891 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-6bhbs"] Apr 23 08:16:44.673407 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:16:44.673379 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dc68295_1c6a_4ff7_a186_4c27b4e63f84.slice/crio-f3ea88f993aa14886bbc75310d51998017812057303c0e517be7040b47381b48 WatchSource:0}: Error finding container f3ea88f993aa14886bbc75310d51998017812057303c0e517be7040b47381b48: Status 404 returned error can't find the container with id f3ea88f993aa14886bbc75310d51998017812057303c0e517be7040b47381b48 Apr 23 08:16:45.009039 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:45.009004 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-6bhbs" event={"ID":"7dc68295-1c6a-4ff7-a186-4c27b4e63f84","Type":"ContainerStarted","Data":"dfafc3a721251804eccfd6e1da161f166ee7fa80666ef27f762b380e048212a1"} 
Apr 23 08:16:45.009039 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:45.009038 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-6bhbs" event={"ID":"7dc68295-1c6a-4ff7-a186-4c27b4e63f84","Type":"ContainerStarted","Data":"f3ea88f993aa14886bbc75310d51998017812057303c0e517be7040b47381b48"} Apr 23 08:16:45.010686 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:45.010654 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn" event={"ID":"7097634a-9704-4d5b-a292-4c6376bde24a","Type":"ContainerStarted","Data":"06f6e4c862439f44474653cbf8ac25a5094165c739ce03086a87211ede1e21d2"} Apr 23 08:16:45.010686 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:45.010684 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn" event={"ID":"7097634a-9704-4d5b-a292-4c6376bde24a","Type":"ContainerStarted","Data":"98281c462d8e2d8739159754c9b35a46251eaf87c07e0c68c53371fcd958430f"} Apr 23 08:16:45.028676 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:45.028650 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" Apr 23 08:16:45.028826 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:45.028809 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:16:45.028883 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:45.028875 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls 
podName:ad2217d3-d0c5-4c5e-a920-ef66096532e4 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:53.028858012 +0000 UTC m=+134.960220809 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-f27qz" (UID: "ad2217d3-d0c5-4c5e-a920-ef66096532e4") : secret "samples-operator-tls" not found Apr 23 08:16:45.030489 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:45.030455 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-6bhbs" podStartSLOduration=1.030444002 podStartE2EDuration="1.030444002s" podCreationTimestamp="2026-04-23 08:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:16:45.028839786 +0000 UTC m=+126.960202588" watchObservedRunningTime="2026-04-23 08:16:45.030444002 +0000 UTC m=+126.961806803" Apr 23 08:16:45.050844 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:45.050804 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pjbfn" podStartSLOduration=1.635602446 podStartE2EDuration="3.050792784s" podCreationTimestamp="2026-04-23 08:16:42 +0000 UTC" firstStartedPulling="2026-04-23 08:16:43.139158287 +0000 UTC m=+125.070521068" lastFinishedPulling="2026-04-23 08:16:44.554348624 +0000 UTC m=+126.485711406" observedRunningTime="2026-04-23 08:16:45.049915202 +0000 UTC m=+126.981278006" watchObservedRunningTime="2026-04-23 08:16:45.050792784 +0000 UTC m=+126.982155618" Apr 23 08:16:45.129915 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:45.129891 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:45.130101 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:45.130072 2561 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:45.130152 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:45.130135 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls podName:34496dd2-18a1-4fe2-a3be-b2d24e4bd928 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:46.130121279 +0000 UTC m=+128.061484075 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gvqs5" (UID: "34496dd2-18a1-4fe2-a3be-b2d24e4bd928") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:46.138869 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:46.138828 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:46.139513 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:46.139493 2561 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:46.139591 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:46.139563 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls podName:34496dd2-18a1-4fe2-a3be-b2d24e4bd928 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:48.139543937 +0000 UTC m=+130.070906720 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gvqs5" (UID: "34496dd2-18a1-4fe2-a3be-b2d24e4bd928") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:47.577126 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:47.577087 2561 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:47.577126 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:47.577131 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" Apr 23 08:16:47.577633 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:47.577489 2561 scope.go:117] "RemoveContainer" containerID="1328b3731263bbac732ebcfa49f9349fb141af7de5bb6c04fff478de49502f80" Apr 23 08:16:47.577669 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:47.577637 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-94fwt_openshift-console-operator(d460c450-63cc-49ec-af6a-6618277ea5cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" podUID="d460c450-63cc-49ec-af6a-6618277ea5cf" Apr 23 08:16:48.158537 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:48.158498 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:48.158711 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:48.158608 2561 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:48.158711 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:48.158661 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls podName:34496dd2-18a1-4fe2-a3be-b2d24e4bd928 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:52.158646643 +0000 UTC m=+134.090009421 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls") pod "insights-runtime-extractor-gvqs5" (UID: "34496dd2-18a1-4fe2-a3be-b2d24e4bd928") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:48.461717 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:48.461685 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:16:48.461880 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:48.461849 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:16:48.461932 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:16:48.461923 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs podName:e92a791e-42ac-4855-b7b5-945f53108891 nodeName:}" failed. No retries permitted until 2026-04-23 08:18:50.461903397 +0000 UTC m=+252.393266179 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs") pod "network-metrics-daemon-pmv55" (UID: "e92a791e-42ac-4855-b7b5-945f53108891") : secret "metrics-daemon-secret" not found Apr 23 08:16:52.191432 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:52.191400 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:52.193663 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:52.193634 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/34496dd2-18a1-4fe2-a3be-b2d24e4bd928-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvqs5\" (UID: \"34496dd2-18a1-4fe2-a3be-b2d24e4bd928\") " pod="openshift-insights/insights-runtime-extractor-gvqs5" Apr 23 08:16:52.237251 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:52.237228 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gvqs5"
Apr 23 08:16:52.365454 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:52.365314 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gvqs5"]
Apr 23 08:16:52.367475 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:16:52.367437 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34496dd2_18a1_4fe2_a3be_b2d24e4bd928.slice/crio-bed3a073dbc455d971081d11c71e347cac64fc51c01d7b4c0caf649f571d77f0 WatchSource:0}: Error finding container bed3a073dbc455d971081d11c71e347cac64fc51c01d7b4c0caf649f571d77f0: Status 404 returned error can't find the container with id bed3a073dbc455d971081d11c71e347cac64fc51c01d7b4c0caf649f571d77f0
Apr 23 08:16:53.032558 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:53.032523 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvqs5" event={"ID":"34496dd2-18a1-4fe2-a3be-b2d24e4bd928","Type":"ContainerStarted","Data":"df31882e099edac345ca546390a7f29182fea7ea95737402271c42558f8701ee"}
Apr 23 08:16:53.032736 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:53.032566 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvqs5" event={"ID":"34496dd2-18a1-4fe2-a3be-b2d24e4bd928","Type":"ContainerStarted","Data":"bed3a073dbc455d971081d11c71e347cac64fc51c01d7b4c0caf649f571d77f0"}
Apr 23 08:16:53.100633 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:53.100593 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz"
Apr 23 08:16:53.103667 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:53.103613 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad2217d3-d0c5-4c5e-a920-ef66096532e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-f27qz\" (UID: \"ad2217d3-d0c5-4c5e-a920-ef66096532e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz"
Apr 23 08:16:53.162655 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:53.162634 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-c2nnb\""
Apr 23 08:16:53.170470 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:53.170448 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz"
Apr 23 08:16:53.283230 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:53.283168 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz"]
Apr 23 08:16:54.037161 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:54.037100 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" event={"ID":"ad2217d3-d0c5-4c5e-a920-ef66096532e4","Type":"ContainerStarted","Data":"7838e76b3e189e2a43365516bc3f193d921fdad57dde21146b2b8389009cd72f"}
Apr 23 08:16:54.039085 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:54.039053 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvqs5" event={"ID":"34496dd2-18a1-4fe2-a3be-b2d24e4bd928","Type":"ContainerStarted","Data":"e3e84eb7a050a9372d6bc1d7d941274d9c7bc77b63c40366eedf07ff78057bb8"}
Apr 23 08:16:56.044581 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:56.044542 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" event={"ID":"ad2217d3-d0c5-4c5e-a920-ef66096532e4","Type":"ContainerStarted","Data":"649c124a86f7000476dc6f173c017fb740b302e9ef7a10e2449125eda47e219f"}
Apr 23 08:16:56.044581 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:56.044583 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" event={"ID":"ad2217d3-d0c5-4c5e-a920-ef66096532e4","Type":"ContainerStarted","Data":"ea0c9e7e2d9499e52a194bdd7ca3d9ec024a1d04524bc2554cb35e14dc01c485"}
Apr 23 08:16:56.046286 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:56.046245 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvqs5" event={"ID":"34496dd2-18a1-4fe2-a3be-b2d24e4bd928","Type":"ContainerStarted","Data":"bde2877ac0773747475661d42896e142c766cb1800316ca07a793e0d36e278e4"}
Apr 23 08:16:56.063607 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:56.063570 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-f27qz" podStartSLOduration=16.989488615 podStartE2EDuration="19.063559207s" podCreationTimestamp="2026-04-23 08:16:37 +0000 UTC" firstStartedPulling="2026-04-23 08:16:53.319653726 +0000 UTC m=+135.251016510" lastFinishedPulling="2026-04-23 08:16:55.393724319 +0000 UTC m=+137.325087102" observedRunningTime="2026-04-23 08:16:56.063399061 +0000 UTC m=+137.994761864" watchObservedRunningTime="2026-04-23 08:16:56.063559207 +0000 UTC m=+137.994922007"
Apr 23 08:16:56.079671 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:56.079631 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gvqs5" podStartSLOduration=9.102002135 podStartE2EDuration="12.079621121s" podCreationTimestamp="2026-04-23 08:16:44 +0000 UTC" firstStartedPulling="2026-04-23 08:16:52.416996498 +0000 UTC m=+134.348359291" lastFinishedPulling="2026-04-23 08:16:55.394615496 +0000 UTC m=+137.325978277" observedRunningTime="2026-04-23 08:16:56.078876727 +0000 UTC m=+138.010239561" watchObservedRunningTime="2026-04-23 08:16:56.079621121 +0000 UTC m=+138.010983922"
Apr 23 08:16:59.049872 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:59.049821 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v"
Apr 23 08:16:59.052182 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:59.052157 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c1dd227-3279-4f30-b918-473a6a080619-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cbh4v\" (UID: \"2c1dd227-3279-4f30-b918-473a6a080619\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v"
Apr 23 08:16:59.294374 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:59.294344 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-49dql\""
Apr 23 08:16:59.301663 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:59.301603 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v"
Apr 23 08:16:59.415057 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:16:59.415031 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v"]
Apr 23 08:16:59.417648 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:16:59.417624 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1dd227_3279_4f30_b918_473a6a080619.slice/crio-9014e815e7dae273c2a94241e9acf6fef13beb5ff23996d998e7461b34eb4c0a WatchSource:0}: Error finding container 9014e815e7dae273c2a94241e9acf6fef13beb5ff23996d998e7461b34eb4c0a: Status 404 returned error can't find the container with id 9014e815e7dae273c2a94241e9acf6fef13beb5ff23996d998e7461b34eb4c0a
Apr 23 08:17:00.056816 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:00.056777 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" event={"ID":"2c1dd227-3279-4f30-b918-473a6a080619","Type":"ContainerStarted","Data":"9014e815e7dae273c2a94241e9acf6fef13beb5ff23996d998e7461b34eb4c0a"}
Apr 23 08:17:02.062592 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:02.062555 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" event={"ID":"2c1dd227-3279-4f30-b918-473a6a080619","Type":"ContainerStarted","Data":"b41ed0e2896428b75a3353c528da2e81d36504b6992964875e1d97fc8ced0b1b"}
Apr 23 08:17:02.080153 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:02.080108 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cbh4v" podStartSLOduration=33.344055422 podStartE2EDuration="35.080094699s" podCreationTimestamp="2026-04-23 08:16:27 +0000 UTC" firstStartedPulling="2026-04-23 08:16:59.419485833 +0000 UTC m=+141.350848615" lastFinishedPulling="2026-04-23 08:17:01.155525108 +0000 UTC m=+143.086887892" observedRunningTime="2026-04-23 08:17:02.07934765 +0000 UTC m=+144.010710451" watchObservedRunningTime="2026-04-23 08:17:02.080094699 +0000 UTC m=+144.011457501"
Apr 23 08:17:02.660618 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:02.660584 2561 scope.go:117] "RemoveContainer" containerID="1328b3731263bbac732ebcfa49f9349fb141af7de5bb6c04fff478de49502f80"
Apr 23 08:17:03.066125 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:03.066101 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log"
Apr 23 08:17:03.066504 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:03.066155 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" event={"ID":"d460c450-63cc-49ec-af6a-6618277ea5cf","Type":"ContainerStarted","Data":"93401e1d591c3ca29d347a488fcc63bb82c2725e34994f81bc7da3240ba8567d"}
Apr 23 08:17:03.083891 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:03.083850 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt" podStartSLOduration=23.080979319 podStartE2EDuration="26.083839028s" podCreationTimestamp="2026-04-23 08:16:37 +0000 UTC" firstStartedPulling="2026-04-23 08:16:37.936587124 +0000 UTC m=+119.867949909" lastFinishedPulling="2026-04-23 08:16:40.939446839 +0000 UTC m=+122.870809618" observedRunningTime="2026-04-23 08:17:03.083143257 +0000 UTC m=+145.014506069" watchObservedRunningTime="2026-04-23 08:17:03.083839028 +0000 UTC m=+145.015201828"
Apr 23 08:17:05.092191 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.092162 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh"]
Apr 23 08:17:05.094194 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.094165 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"]
Apr 23 08:17:05.094334 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.094301 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh"
Apr 23 08:17:05.096090 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.096069 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"
Apr 23 08:17:05.097677 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.097658 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-4h2pl\""
Apr 23 08:17:05.098895 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.098877 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 23 08:17:05.099030 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.099013 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-z7d65\""
Apr 23 08:17:05.109458 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.109422 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh"]
Apr 23 08:17:05.110653 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.110633 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"]
Apr 23 08:17:05.197793 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.197764 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/26e34e7b-74c5-44ab-a606-d279a8dc3619-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ksft5\" (UID: \"26e34e7b-74c5-44ab-a606-d279a8dc3619\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"
Apr 23 08:17:05.197944 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.197848 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fplm2\" (UniqueName: \"kubernetes.io/projected/fc7bcc2c-0662-49d5-846d-e6a5358d369a-kube-api-access-fplm2\") pod \"network-check-source-8894fc9bd-cqjnh\" (UID: \"fc7bcc2c-0662-49d5-846d-e6a5358d369a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh"
Apr 23 08:17:05.298608 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.298579 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/26e34e7b-74c5-44ab-a606-d279a8dc3619-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ksft5\" (UID: \"26e34e7b-74c5-44ab-a606-d279a8dc3619\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"
Apr 23 08:17:05.298749 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.298691 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fplm2\" (UniqueName: \"kubernetes.io/projected/fc7bcc2c-0662-49d5-846d-e6a5358d369a-kube-api-access-fplm2\") pod \"network-check-source-8894fc9bd-cqjnh\" (UID: \"fc7bcc2c-0662-49d5-846d-e6a5358d369a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh"
Apr 23 08:17:05.301463 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.301445 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/26e34e7b-74c5-44ab-a606-d279a8dc3619-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ksft5\" (UID: \"26e34e7b-74c5-44ab-a606-d279a8dc3619\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"
Apr 23 08:17:05.312564 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.312544 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fplm2\" (UniqueName: \"kubernetes.io/projected/fc7bcc2c-0662-49d5-846d-e6a5358d369a-kube-api-access-fplm2\") pod \"network-check-source-8894fc9bd-cqjnh\" (UID: \"fc7bcc2c-0662-49d5-846d-e6a5358d369a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh"
Apr 23 08:17:05.404670 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.404608 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh"
Apr 23 08:17:05.412615 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.412592 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"
Apr 23 08:17:05.543615 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.543503 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh"]
Apr 23 08:17:05.545809 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:17:05.545782 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc7bcc2c_0662_49d5_846d_e6a5358d369a.slice/crio-b016d63b519829ffcbe486d29034a165018e83daeb838ab73ed19d220916d19d WatchSource:0}: Error finding container b016d63b519829ffcbe486d29034a165018e83daeb838ab73ed19d220916d19d: Status 404 returned error can't find the container with id b016d63b519829ffcbe486d29034a165018e83daeb838ab73ed19d220916d19d
Apr 23 08:17:05.553915 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:05.553878 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"]
Apr 23 08:17:05.557043 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:17:05.557017 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e34e7b_74c5_44ab_a606_d279a8dc3619.slice/crio-45f966584a34d6fb889af94118578e0b9ba8851bd2f1c2ec15a2b341ae9cf34c WatchSource:0}: Error finding container 45f966584a34d6fb889af94118578e0b9ba8851bd2f1c2ec15a2b341ae9cf34c: Status 404 returned error can't find the container with id 45f966584a34d6fb889af94118578e0b9ba8851bd2f1c2ec15a2b341ae9cf34c
Apr 23 08:17:06.073933 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:06.073898 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5" event={"ID":"26e34e7b-74c5-44ab-a606-d279a8dc3619","Type":"ContainerStarted","Data":"45f966584a34d6fb889af94118578e0b9ba8851bd2f1c2ec15a2b341ae9cf34c"}
Apr 23 08:17:06.075160 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:06.075139 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh" event={"ID":"fc7bcc2c-0662-49d5-846d-e6a5358d369a","Type":"ContainerStarted","Data":"1acf3e3d4e43ecb0976798250eac7add733fa805857f3b65c7c41340d9c14036"}
Apr 23 08:17:06.075297 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:06.075163 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh" event={"ID":"fc7bcc2c-0662-49d5-846d-e6a5358d369a","Type":"ContainerStarted","Data":"b016d63b519829ffcbe486d29034a165018e83daeb838ab73ed19d220916d19d"}
Apr 23 08:17:06.090320 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:06.090281 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cqjnh" podStartSLOduration=1.090256257 podStartE2EDuration="1.090256257s" podCreationTimestamp="2026-04-23 08:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:17:06.089707346 +0000 UTC m=+148.021070146" watchObservedRunningTime="2026-04-23 08:17:06.090256257 +0000 UTC m=+148.021619120"
Apr 23 08:17:07.079634 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:07.079594 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5" event={"ID":"26e34e7b-74c5-44ab-a606-d279a8dc3619","Type":"ContainerStarted","Data":"16dbbb1e0199d91de82bcd1d04b0b188f126f15a79e6569ddcb3b07f41080bb3"}
Apr 23 08:17:07.096505 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:07.096453 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5" podStartSLOduration=0.973347762 podStartE2EDuration="2.096437683s" podCreationTimestamp="2026-04-23 08:17:05 +0000 UTC" firstStartedPulling="2026-04-23 08:17:05.558801882 +0000 UTC m=+147.490164663" lastFinishedPulling="2026-04-23 08:17:06.681891802 +0000 UTC m=+148.613254584" observedRunningTime="2026-04-23 08:17:07.094534551 +0000 UTC m=+149.025897355" watchObservedRunningTime="2026-04-23 08:17:07.096437683 +0000 UTC m=+149.027800488"
Apr 23 08:17:08.082307 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:08.082257 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"
Apr 23 08:17:08.086573 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:08.086556 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ksft5"
Apr 23 08:17:13.066781 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:13.066749 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt"
Apr 23 08:17:13.070398 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:13.070372 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-94fwt"
Apr 23 08:17:14.116056 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.116025 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nl62j"]
Apr 23 08:17:14.121526 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.121504 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.123919 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.123899 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-jn6vp\""
Apr 23 08:17:14.124076 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.124056 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 08:17:14.124151 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.124056 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 08:17:14.124229 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.124208 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 08:17:14.124939 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.124925 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 08:17:14.168817 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.168794 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5b7a07bf-2318-478f-9149-ee5a0395ef3f-root\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.169003 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.168986 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwqg\" (UniqueName: \"kubernetes.io/projected/5b7a07bf-2318-478f-9149-ee5a0395ef3f-kube-api-access-mlwqg\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.169149 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.169135 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b7a07bf-2318-478f-9149-ee5a0395ef3f-metrics-client-ca\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.169300 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.169283 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-wtmp\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.169548 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.169531 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-tls\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.169724 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.169709 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-accelerators-collector-config\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.169859 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.169846 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b7a07bf-2318-478f-9149-ee5a0395ef3f-sys\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.170033 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.169985 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-textfile\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.170156 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.170141 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.270893 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.270861 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwqg\" (UniqueName: \"kubernetes.io/projected/5b7a07bf-2318-478f-9149-ee5a0395ef3f-kube-api-access-mlwqg\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.270893 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.270895 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b7a07bf-2318-478f-9149-ee5a0395ef3f-metrics-client-ca\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271112 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.270915 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-wtmp\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271112 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271049 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-wtmp\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271112 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271072 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-tls\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271324 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271119 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-accelerators-collector-config\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271324 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271145 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b7a07bf-2318-478f-9149-ee5a0395ef3f-sys\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271324 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271180 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-textfile\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271324 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271222 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271324 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271229 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b7a07bf-2318-478f-9149-ee5a0395ef3f-sys\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271324 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271246 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5b7a07bf-2318-478f-9149-ee5a0395ef3f-root\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271723 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271348 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5b7a07bf-2318-478f-9149-ee5a0395ef3f-root\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271723 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271605 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-textfile\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271723 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271606 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b7a07bf-2318-478f-9149-ee5a0395ef3f-metrics-client-ca\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.271723 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.271703 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-accelerators-collector-config\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.273533 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.273513 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.273666 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.273648 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5b7a07bf-2318-478f-9149-ee5a0395ef3f-node-exporter-tls\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.278205 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.278186 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwqg\" (UniqueName: \"kubernetes.io/projected/5b7a07bf-2318-478f-9149-ee5a0395ef3f-kube-api-access-mlwqg\") pod \"node-exporter-nl62j\" (UID: \"5b7a07bf-2318-478f-9149-ee5a0395ef3f\") " pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.431339 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:14.431253 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nl62j"
Apr 23 08:17:14.438975 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:17:14.438952 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b7a07bf_2318_478f_9149_ee5a0395ef3f.slice/crio-e2b1012b7c4855e633dceb8223f5722be7fcb0514488d524936ca7ae39592c04 WatchSource:0}: Error finding container e2b1012b7c4855e633dceb8223f5722be7fcb0514488d524936ca7ae39592c04: Status 404 returned error can't find the container with id e2b1012b7c4855e633dceb8223f5722be7fcb0514488d524936ca7ae39592c04
Apr 23 08:17:14.469528 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:17:14.469501 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-46nvx" podUID="3ada2676-04c4-4126-a943-cd1d167949aa"
Apr 23 08:17:14.475869 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:17:14.475840 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-ttph8" podUID="c7c0ad21-b2af-4a80-a79c-000cff3a91ab"
Apr 23 08:17:15.105557 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.105517 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nl62j" event={"ID":"5b7a07bf-2318-478f-9149-ee5a0395ef3f","Type":"ContainerStarted","Data":"e2b1012b7c4855e633dceb8223f5722be7fcb0514488d524936ca7ae39592c04"}
Apr 23 08:17:15.105736 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.105610 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:17:15.105736 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.105610 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-46nvx"
Apr 23 08:17:15.144768 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.144743 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 08:17:15.149864 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.149846 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 08:17:15.152603 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.152395 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 08:17:15.152603 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.152478 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 08:17:15.152603 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.152477 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 08:17:15.152603 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.152547 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 08:17:15.152898 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.152405 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 08:17:15.152898 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.152715 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 08:17:15.152898 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.152788 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 08:17:15.153323 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.153303 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-z9nvw\""
Apr 23 08:17:15.153420 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.153329 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 08:17:15.153550 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.153534 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 08:17:15.164610 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.164591 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 08:17:15.279727 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.279705 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 08:17:15.279831 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.279740 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 08:17:15.279831 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.279797 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-web-config\") pod \"alertmanager-main-0\" (UID:
\"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.279831 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.279825 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.279993 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.279850 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-config-volume\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.279993 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.279871 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.279993 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.279898 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.279993 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.279967 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.280180 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.280007 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64ngt\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-kube-api-access-64ngt\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.280180 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.280060 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.280180 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.280095 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.280180 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.280144 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-config-out\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 23 08:17:15.280180 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.280165 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381504 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381440 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381504 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381490 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-config-volume\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381641 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381610 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381697 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381641 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381697 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381664 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381796 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381770 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64ngt\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-kube-api-access-64ngt\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381844 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381798 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381844 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381809 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381951 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381859 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381951 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381909 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-config-out\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.381951 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381941 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.382106 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.381974 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.382106 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.382004 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 
08:17:15.382106 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.382098 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-web-config\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.382637 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.382611 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.383443 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.383416 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.384707 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.384553 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.384806 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.384756 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.384866 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.384830 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-config-volume\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.385443 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.385412 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.385620 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.385597 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.385696 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.385619 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-web-config\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.385914 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.385886 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: 
\"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.386076 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.386058 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.386419 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.386400 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-config-out\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.395407 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.395379 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64ngt\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-kube-api-access-64ngt\") pod \"alertmanager-main-0\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.461846 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.461820 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:17:15.584588 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:15.584506 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:17:15.587946 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:17:15.587923 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb15804e_aa87_4672_93ff_da7e97217b1f.slice/crio-b5a2489d82692a94602a675235188e5c8115e2e8a7b10b135cf6a304eb7e4ab2 WatchSource:0}: Error finding container b5a2489d82692a94602a675235188e5c8115e2e8a7b10b135cf6a304eb7e4ab2: Status 404 returned error can't find the container with id b5a2489d82692a94602a675235188e5c8115e2e8a7b10b135cf6a304eb7e4ab2 Apr 23 08:17:15.671191 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:17:15.671129 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-pmv55" podUID="e92a791e-42ac-4855-b7b5-945f53108891" Apr 23 08:17:16.108706 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.108667 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerStarted","Data":"b5a2489d82692a94602a675235188e5c8115e2e8a7b10b135cf6a304eb7e4ab2"} Apr 23 08:17:16.110018 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.109994 2561 generic.go:358] "Generic (PLEG): container finished" podID="5b7a07bf-2318-478f-9149-ee5a0395ef3f" containerID="a37ed84ca9593f5527359420c80362cca55a9c3d53012e9661994f3caf5eaa41" exitCode=0 Apr 23 08:17:16.110082 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.110043 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nl62j" 
event={"ID":"5b7a07bf-2318-478f-9149-ee5a0395ef3f","Type":"ContainerDied","Data":"a37ed84ca9593f5527359420c80362cca55a9c3d53012e9661994f3caf5eaa41"} Apr 23 08:17:16.146758 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.146734 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"] Apr 23 08:17:16.151473 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.151454 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.153999 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.153980 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 08:17:16.154092 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.154050 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8v0ipib0buqdg\"" Apr 23 08:17:16.154810 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.154771 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 08:17:16.154810 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.154787 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 08:17:16.154963 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.154817 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 08:17:16.154963 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.154776 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-g4464\"" Apr 23 08:17:16.154963 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.154774 2561 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 08:17:16.161026 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.160958 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"] Apr 23 08:17:16.289340 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.289288 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-grpc-tls\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.289513 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.289352 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5effe1b4-68bd-4d42-b808-4141bd7e5df5-metrics-client-ca\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.289513 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.289411 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.289513 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.289453 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6k24\" (UniqueName: \"kubernetes.io/projected/5effe1b4-68bd-4d42-b808-4141bd7e5df5-kube-api-access-t6k24\") pod 
\"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.289513 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.289493 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.289664 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.289521 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.289664 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.289583 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.289664 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.289601 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-tls\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: 
\"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.390706 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.390600 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.390706 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.390653 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-tls\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.390706 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.390697 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-grpc-tls\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.390969 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.390742 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5effe1b4-68bd-4d42-b808-4141bd7e5df5-metrics-client-ca\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:16.390969 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.390788 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.390969 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.390837 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6k24\" (UniqueName: \"kubernetes.io/projected/5effe1b4-68bd-4d42-b808-4141bd7e5df5-kube-api-access-t6k24\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.390969 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.390881 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.390969 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.390919 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.391575 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.391550 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5effe1b4-68bd-4d42-b808-4141bd7e5df5-metrics-client-ca\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.394159 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.394113 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.394159 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.394135 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.394350 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.394215 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-tls\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.394498 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.394429 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.394629 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.394605 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.394989 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.394969 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5effe1b4-68bd-4d42-b808-4141bd7e5df5-secret-grpc-tls\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.399336 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.399318 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6k24\" (UniqueName: \"kubernetes.io/projected/5effe1b4-68bd-4d42-b808-4141bd7e5df5-kube-api-access-t6k24\") pod \"thanos-querier-c655cf7d5-k7g7j\" (UID: \"5effe1b4-68bd-4d42-b808-4141bd7e5df5\") " pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.460239 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.460217 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
Apr 23 08:17:16.601109 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:16.599886 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"]
Apr 23 08:17:16.604321 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:17:16.604288 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5effe1b4_68bd_4d42_b808_4141bd7e5df5.slice/crio-619bd5aa8b0d1b145ab99fc40971693bbd8dd227e4af508bd863db03e930d6ad WatchSource:0}: Error finding container 619bd5aa8b0d1b145ab99fc40971693bbd8dd227e4af508bd863db03e930d6ad: Status 404 returned error can't find the container with id 619bd5aa8b0d1b145ab99fc40971693bbd8dd227e4af508bd863db03e930d6ad
Apr 23 08:17:17.113842 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:17.113815 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" event={"ID":"5effe1b4-68bd-4d42-b808-4141bd7e5df5","Type":"ContainerStarted","Data":"619bd5aa8b0d1b145ab99fc40971693bbd8dd227e4af508bd863db03e930d6ad"}
Apr 23 08:17:17.115105 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:17.115080 2561 generic.go:358] "Generic (PLEG): container finished" podID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerID="2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2" exitCode=0
Apr 23 08:17:17.115224 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:17.115146 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerDied","Data":"2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2"}
Apr 23 08:17:17.117101 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:17.117077 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nl62j" event={"ID":"5b7a07bf-2318-478f-9149-ee5a0395ef3f","Type":"ContainerStarted","Data":"cf34fc2c947270cf86c44450ec3644bc69ef5b6c5c8bcca2f22fbad91d79920c"}
Apr 23 08:17:17.117202 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:17.117104 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nl62j" event={"ID":"5b7a07bf-2318-478f-9149-ee5a0395ef3f","Type":"ContainerStarted","Data":"465ca42473f3fea8f6c84cb453678e4281175a25d3a424cf5014f8057c38e4d3"}
Apr 23 08:17:17.161743 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:17.161701 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nl62j" podStartSLOduration=2.348482198 podStartE2EDuration="3.161690994s" podCreationTimestamp="2026-04-23 08:17:14 +0000 UTC" firstStartedPulling="2026-04-23 08:17:14.440404063 +0000 UTC m=+156.371766844" lastFinishedPulling="2026-04-23 08:17:15.253612861 +0000 UTC m=+157.184975640" observedRunningTime="2026-04-23 08:17:17.161205641 +0000 UTC m=+159.092568441" watchObservedRunningTime="2026-04-23 08:17:17.161690994 +0000 UTC m=+159.093053795"
Apr 23 08:17:19.126391 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.126357 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerStarted","Data":"cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b"}
Apr 23 08:17:19.126391 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.126397 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerStarted","Data":"ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb"}
Apr 23 08:17:19.126757 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.126407 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerStarted","Data":"a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb"}
Apr 23 08:17:19.126757 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.126415 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerStarted","Data":"02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6"}
Apr 23 08:17:19.126757 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.126423 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerStarted","Data":"d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08"}
Apr 23 08:17:19.323312 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.323277 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"]
Apr 23 08:17:19.326661 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.326645 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.329465 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.329430 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 23 08:17:19.329465 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.329447 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 23 08:17:19.329650 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.329501 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 23 08:17:19.329650 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.329523 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-mg6fh\""
Apr 23 08:17:19.329650 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.329430 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 23 08:17:19.329650 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.329533 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 23 08:17:19.334083 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.334065 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 23 08:17:19.340294 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.340252 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"]
Apr 23 08:17:19.419096 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419067 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4db818a7-277d-4827-bfd7-b70afd3dbbe4-metrics-client-ca\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.419257 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419102 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db818a7-277d-4827-bfd7-b70afd3dbbe4-serving-certs-ca-bundle\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.419257 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419132 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-secret-telemeter-client\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.419257 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419156 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5wsx\" (UniqueName: \"kubernetes.io/projected/4db818a7-277d-4827-bfd7-b70afd3dbbe4-kube-api-access-w5wsx\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.419257 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419234 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:17:19.419485 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419275 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-telemeter-client-tls\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.419485 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419302 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db818a7-277d-4827-bfd7-b70afd3dbbe4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.419485 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419327 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:17:19.419485 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419370 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-federate-client-tls\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.419485 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.419401 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.421519 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.421495 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3ada2676-04c4-4126-a943-cd1d167949aa-metrics-tls\") pod \"dns-default-46nvx\" (UID: \"3ada2676-04c4-4126-a943-cd1d167949aa\") " pod="openshift-dns/dns-default-46nvx"
Apr 23 08:17:19.421614 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.421562 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7c0ad21-b2af-4a80-a79c-000cff3a91ab-cert\") pod \"ingress-canary-ttph8\" (UID: \"c7c0ad21-b2af-4a80-a79c-000cff3a91ab\") " pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:17:19.520776 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.520738 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4db818a7-277d-4827-bfd7-b70afd3dbbe4-metrics-client-ca\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.520955 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.520790 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db818a7-277d-4827-bfd7-b70afd3dbbe4-serving-certs-ca-bundle\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.520955 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.520832 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-secret-telemeter-client\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.520955 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.520859 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5wsx\" (UniqueName: \"kubernetes.io/projected/4db818a7-277d-4827-bfd7-b70afd3dbbe4-kube-api-access-w5wsx\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.520955 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.520924 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-telemeter-client-tls\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.521180 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.520969 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db818a7-277d-4827-bfd7-b70afd3dbbe4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.521180 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.521030 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-federate-client-tls\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.521180 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.521055 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.521662 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.521635 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db818a7-277d-4827-bfd7-b70afd3dbbe4-serving-certs-ca-bundle\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.521771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.521707 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4db818a7-277d-4827-bfd7-b70afd3dbbe4-metrics-client-ca\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.521995 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.521974 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db818a7-277d-4827-bfd7-b70afd3dbbe4-telemeter-trusted-ca-bundle\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.523568 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.523547 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-telemeter-client-tls\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.523743 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.523724 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.523938 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.523917 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-federate-client-tls\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.523978 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.523917 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4db818a7-277d-4827-bfd7-b70afd3dbbe4-secret-telemeter-client\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.528760 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.528741 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5wsx\" (UniqueName: \"kubernetes.io/projected/4db818a7-277d-4827-bfd7-b70afd3dbbe4-kube-api-access-w5wsx\") pod \"telemeter-client-84c6475f8d-fqb5d\" (UID: \"4db818a7-277d-4827-bfd7-b70afd3dbbe4\") " pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.609091 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.609012 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkwh7\""
Apr 23 08:17:19.609252 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.609094 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f2cmm\""
Apr 23 08:17:19.617181 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.617164 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ttph8"
Apr 23 08:17:19.617246 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.617195 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-46nvx"
Apr 23 08:17:19.635809 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.635784 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"
Apr 23 08:17:19.785375 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.785278 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ttph8"]
Apr 23 08:17:19.812919 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.812871 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-46nvx"]
Apr 23 08:17:19.816320 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:17:19.816291 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ada2676_04c4_4126_a943_cd1d167949aa.slice/crio-6cfbd19355bec1beb626e5fcd71dbbad1647362f507461f1c3afd4d4a89e3fe7 WatchSource:0}: Error finding container 6cfbd19355bec1beb626e5fcd71dbbad1647362f507461f1c3afd4d4a89e3fe7: Status 404 returned error can't find the container with id 6cfbd19355bec1beb626e5fcd71dbbad1647362f507461f1c3afd4d4a89e3fe7
Apr 23 08:17:19.844505 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:19.844257 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-84c6475f8d-fqb5d"]
Apr 23 08:17:19.846880 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:17:19.846852 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4db818a7_277d_4827_bfd7_b70afd3dbbe4.slice/crio-40b1fe43a8aebc29fd466569c67efc2319eefce2197c61bf07c62f159b96af70 WatchSource:0}: Error finding container 40b1fe43a8aebc29fd466569c67efc2319eefce2197c61bf07c62f159b96af70: Status 404 returned error can't find the container with id 40b1fe43a8aebc29fd466569c67efc2319eefce2197c61bf07c62f159b96af70
Apr 23 08:17:20.130521 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.130480 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ttph8" event={"ID":"c7c0ad21-b2af-4a80-a79c-000cff3a91ab","Type":"ContainerStarted","Data":"b1904df0472f5862d5177c7cd28a3b8993e6d79976e88e9858ad55bc5e345715"}
Apr 23 08:17:20.131578 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.131547 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-46nvx" event={"ID":"3ada2676-04c4-4126-a943-cd1d167949aa","Type":"ContainerStarted","Data":"6cfbd19355bec1beb626e5fcd71dbbad1647362f507461f1c3afd4d4a89e3fe7"}
Apr 23 08:17:20.132689 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.132657 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d" event={"ID":"4db818a7-277d-4827-bfd7-b70afd3dbbe4","Type":"ContainerStarted","Data":"40b1fe43a8aebc29fd466569c67efc2319eefce2197c61bf07c62f159b96af70"}
Apr 23 08:17:20.421070 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.420994 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65d9cf4bd-7rckb"]
Apr 23 08:17:20.424589 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.424570 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.427187 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.427164 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xd4rh\""
Apr 23 08:17:20.427187 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.427180 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 08:17:20.427355 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.427314 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 08:17:20.428605 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.428380 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 08:17:20.428605 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.428448 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 08:17:20.428605 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.428522 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 08:17:20.428605 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.428531 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 08:17:20.428605 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.428594 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 08:17:20.433624 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.433153 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 08:17:20.435217 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.435200 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d9cf4bd-7rckb"]
Apr 23 08:17:20.530089 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.530058 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-console-config\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.530089 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.530095 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-service-ca\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.530314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.530119 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wktlm\" (UniqueName: \"kubernetes.io/projected/411dd30e-d871-429c-a929-ddfc5abddc5a-kube-api-access-wktlm\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.530314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.530177 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-oauth-serving-cert\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.530314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.530221 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-oauth-config\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.530314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.530252 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-trusted-ca-bundle\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.530477 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.530341 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-serving-cert\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.631861 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.631831 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-serving-cert\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.631980 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.631906 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-console-config\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.631980 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.631937 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-service-ca\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.631980 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.631974 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wktlm\" (UniqueName: \"kubernetes.io/projected/411dd30e-d871-429c-a929-ddfc5abddc5a-kube-api-access-wktlm\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.632143 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.632001 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-oauth-serving-cert\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.632143 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.632034 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-oauth-config\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.632143 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.632060 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-trusted-ca-bundle\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.633570 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.632912 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-trusted-ca-bundle\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.633570 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.633070 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-console-config\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.633570 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.633518 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-oauth-serving-cert\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.633860 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.633722 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-service-ca\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.635799 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.635776 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-serving-cert\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.637174 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.637126 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-oauth-config\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.641505 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.641480 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wktlm\" (UniqueName: \"kubernetes.io/projected/411dd30e-d871-429c-a929-ddfc5abddc5a-kube-api-access-wktlm\") pod \"console-65d9cf4bd-7rckb\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") " pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.736885 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.736503 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:17:20.909040 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:20.909008 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d9cf4bd-7rckb"]
Apr 23 08:17:20.912779 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:17:20.912752 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod411dd30e_d871_429c_a929_ddfc5abddc5a.slice/crio-a4b7a794cefd627ae19ee0c76efc78df1740828aa1fda9a1dcb982a4b1deff1c WatchSource:0}: Error finding container a4b7a794cefd627ae19ee0c76efc78df1740828aa1fda9a1dcb982a4b1deff1c: Status 404 returned error can't find the container with id a4b7a794cefd627ae19ee0c76efc78df1740828aa1fda9a1dcb982a4b1deff1c
Apr 23 08:17:21.140208 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.140085 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j"
event={"ID":"5effe1b4-68bd-4d42-b808-4141bd7e5df5","Type":"ContainerStarted","Data":"b73bbba5e0aeb96c37a0334e2885aaeed098b637ade6e856a036e0e907f92802"} Apr 23 08:17:21.140208 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.140129 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" event={"ID":"5effe1b4-68bd-4d42-b808-4141bd7e5df5","Type":"ContainerStarted","Data":"24a767a42ae8e34491cbc518c3a361d8bb047eca53bc88db21136696f03ca771"} Apr 23 08:17:21.140208 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.140146 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" event={"ID":"5effe1b4-68bd-4d42-b808-4141bd7e5df5","Type":"ContainerStarted","Data":"f70fa5b05bde875e632661fd647e804ab57e181c1088f8510cfc440ef52420fb"} Apr 23 08:17:21.140208 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.140159 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" event={"ID":"5effe1b4-68bd-4d42-b808-4141bd7e5df5","Type":"ContainerStarted","Data":"069c2f0bd41f54e3cc7ca570e50cf038e72a1837f04d88f5d68ef87525209597"} Apr 23 08:17:21.140208 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.140171 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" event={"ID":"5effe1b4-68bd-4d42-b808-4141bd7e5df5","Type":"ContainerStarted","Data":"d7f064da2e9711e310f097bfa7e796ffdc07c555de79943888926eab0684ab1e"} Apr 23 08:17:21.140208 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.140183 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" event={"ID":"5effe1b4-68bd-4d42-b808-4141bd7e5df5","Type":"ContainerStarted","Data":"4c12e37f13b35e8e6bbac57a54c5c2d38a6a9cc8f8b01d3f371972623eb397d9"} Apr 23 08:17:21.140924 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.140319 2561 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:21.144227 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.144180 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerStarted","Data":"2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83"} Apr 23 08:17:21.145702 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.145617 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d9cf4bd-7rckb" event={"ID":"411dd30e-d871-429c-a929-ddfc5abddc5a","Type":"ContainerStarted","Data":"a4b7a794cefd627ae19ee0c76efc78df1740828aa1fda9a1dcb982a4b1deff1c"} Apr 23 08:17:21.163904 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.163858 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" podStartSLOduration=1.2292290160000001 podStartE2EDuration="5.16384272s" podCreationTimestamp="2026-04-23 08:17:16 +0000 UTC" firstStartedPulling="2026-04-23 08:17:16.606453101 +0000 UTC m=+158.537815892" lastFinishedPulling="2026-04-23 08:17:20.541066812 +0000 UTC m=+162.472429596" observedRunningTime="2026-04-23 08:17:21.163001389 +0000 UTC m=+163.094364214" watchObservedRunningTime="2026-04-23 08:17:21.16384272 +0000 UTC m=+163.095205532" Apr 23 08:17:21.188040 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:21.187932 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.23681724 podStartE2EDuration="6.18791757s" podCreationTimestamp="2026-04-23 08:17:15 +0000 UTC" firstStartedPulling="2026-04-23 08:17:15.589959177 +0000 UTC m=+157.521321967" lastFinishedPulling="2026-04-23 08:17:20.541059518 +0000 UTC m=+162.472422297" observedRunningTime="2026-04-23 08:17:21.18636183 +0000 UTC m=+163.117724634" 
watchObservedRunningTime="2026-04-23 08:17:21.18791757 +0000 UTC m=+163.119280374" Apr 23 08:17:24.164331 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:24.164247 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d9cf4bd-7rckb" event={"ID":"411dd30e-d871-429c-a929-ddfc5abddc5a","Type":"ContainerStarted","Data":"5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e"} Apr 23 08:17:24.170539 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:24.170457 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ttph8" event={"ID":"c7c0ad21-b2af-4a80-a79c-000cff3a91ab","Type":"ContainerStarted","Data":"839ec9e38f108241ab66c42be81ad1b5ccfca0c77789ca8e67dbbbdf231bd0e0"} Apr 23 08:17:24.175858 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:24.175831 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-46nvx" event={"ID":"3ada2676-04c4-4126-a943-cd1d167949aa","Type":"ContainerStarted","Data":"136ef7a6277f9785832585011e23aa0945eac9c0d185018df6c691f6f042cf5b"} Apr 23 08:17:24.178944 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:24.178923 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d" event={"ID":"4db818a7-277d-4827-bfd7-b70afd3dbbe4","Type":"ContainerStarted","Data":"97f51c3a45dd2bae5da93d8d45db6535bd3bc5303a65dfc4ba4ac6cd39d928ac"} Apr 23 08:17:24.182851 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:24.182797 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65d9cf4bd-7rckb" podStartSLOduration=1.081318606 podStartE2EDuration="4.182783475s" podCreationTimestamp="2026-04-23 08:17:20 +0000 UTC" firstStartedPulling="2026-04-23 08:17:20.916177904 +0000 UTC m=+162.847540688" lastFinishedPulling="2026-04-23 08:17:24.01764276 +0000 UTC m=+165.949005557" observedRunningTime="2026-04-23 08:17:24.182193875 +0000 UTC m=+166.113556688" 
watchObservedRunningTime="2026-04-23 08:17:24.182783475 +0000 UTC m=+166.114146277" Apr 23 08:17:24.205299 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:24.204916 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ttph8" podStartSLOduration=129.029721512 podStartE2EDuration="2m13.204896227s" podCreationTimestamp="2026-04-23 08:15:11 +0000 UTC" firstStartedPulling="2026-04-23 08:17:19.793414141 +0000 UTC m=+161.724776933" lastFinishedPulling="2026-04-23 08:17:23.968588857 +0000 UTC m=+165.899951648" observedRunningTime="2026-04-23 08:17:24.203520376 +0000 UTC m=+166.134883178" watchObservedRunningTime="2026-04-23 08:17:24.204896227 +0000 UTC m=+166.136259028" Apr 23 08:17:25.184310 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:25.184275 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-46nvx" event={"ID":"3ada2676-04c4-4126-a943-cd1d167949aa","Type":"ContainerStarted","Data":"9f9352d28cea658756132afe3ce6c726eb3e02d88fbdcba76f200e35f5ced53d"} Apr 23 08:17:25.184732 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:25.184418 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-46nvx" Apr 23 08:17:25.186167 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:25.186144 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d" event={"ID":"4db818a7-277d-4827-bfd7-b70afd3dbbe4","Type":"ContainerStarted","Data":"0dd536589010c779c4ec9237a443533e2469d7fb04fe01e7561a9803deac71d5"} Apr 23 08:17:25.186281 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:25.186168 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d" event={"ID":"4db818a7-277d-4827-bfd7-b70afd3dbbe4","Type":"ContainerStarted","Data":"9c9ff367f1ef55d2706bad20328774738f4e9b56f95b293af40d8602931217ec"} Apr 23 08:17:25.202505 ip-10-0-134-8 
kubenswrapper[2561]: I0423 08:17:25.202459 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-46nvx" podStartSLOduration=130.055062333 podStartE2EDuration="2m14.202448563s" podCreationTimestamp="2026-04-23 08:15:11 +0000 UTC" firstStartedPulling="2026-04-23 08:17:19.819065333 +0000 UTC m=+161.750428124" lastFinishedPulling="2026-04-23 08:17:23.966451573 +0000 UTC m=+165.897814354" observedRunningTime="2026-04-23 08:17:25.200856368 +0000 UTC m=+167.132219205" watchObservedRunningTime="2026-04-23 08:17:25.202448563 +0000 UTC m=+167.133811409" Apr 23 08:17:25.222907 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:25.222867 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-84c6475f8d-fqb5d" podStartSLOduration=2.099427131 podStartE2EDuration="6.222856847s" podCreationTimestamp="2026-04-23 08:17:19 +0000 UTC" firstStartedPulling="2026-04-23 08:17:19.84911432 +0000 UTC m=+161.780477114" lastFinishedPulling="2026-04-23 08:17:23.972544046 +0000 UTC m=+165.903906830" observedRunningTime="2026-04-23 08:17:25.220620911 +0000 UTC m=+167.151983712" watchObservedRunningTime="2026-04-23 08:17:25.222856847 +0000 UTC m=+167.154219647" Apr 23 08:17:27.156295 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.156253 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-c655cf7d5-k7g7j" Apr 23 08:17:27.169908 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.169881 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8f5ff9678-d7ctm"] Apr 23 08:17:27.175416 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.175393 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.184703 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.184678 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8f5ff9678-d7ctm"] Apr 23 08:17:27.295150 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.295116 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-serving-cert\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.295321 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.295165 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drhls\" (UniqueName: \"kubernetes.io/projected/5f75f47e-3806-4d72-88e2-7b0d59316df6-kube-api-access-drhls\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.295321 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.295273 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-oauth-config\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.295436 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.295323 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-trusted-ca-bundle\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 
08:17:27.295436 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.295367 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-service-ca\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.295436 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.295403 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-oauth-serving-cert\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.295625 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.295602 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-config\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.396483 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.396447 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-oauth-config\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.396483 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.396483 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-trusted-ca-bundle\") pod \"console-8f5ff9678-d7ctm\" (UID: 
\"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.396726 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.396518 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-service-ca\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.396726 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.396548 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-oauth-serving-cert\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.396726 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.396598 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-config\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.396726 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.396658 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-serving-cert\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.396726 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.396688 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drhls\" (UniqueName: 
\"kubernetes.io/projected/5f75f47e-3806-4d72-88e2-7b0d59316df6-kube-api-access-drhls\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.397446 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.397415 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-service-ca\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.397570 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.397477 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-config\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.397570 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.397483 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-trusted-ca-bundle\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.397570 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.397507 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-oauth-serving-cert\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.398967 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.398942 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-serving-cert\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.399080 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.398986 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-oauth-config\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.408342 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.408290 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drhls\" (UniqueName: \"kubernetes.io/projected/5f75f47e-3806-4d72-88e2-7b0d59316df6-kube-api-access-drhls\") pod \"console-8f5ff9678-d7ctm\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") " pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.485483 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.485450 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:27.603907 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:27.603882 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8f5ff9678-d7ctm"] Apr 23 08:17:27.605974 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:17:27.605937 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f75f47e_3806_4d72_88e2_7b0d59316df6.slice/crio-5dced97f5721cf82dcee355b76a012be3fc869016506b44d9cc6a6f5e3488830 WatchSource:0}: Error finding container 5dced97f5721cf82dcee355b76a012be3fc869016506b44d9cc6a6f5e3488830: Status 404 returned error can't find the container with id 5dced97f5721cf82dcee355b76a012be3fc869016506b44d9cc6a6f5e3488830 Apr 23 08:17:28.198736 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:28.198702 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8f5ff9678-d7ctm" event={"ID":"5f75f47e-3806-4d72-88e2-7b0d59316df6","Type":"ContainerStarted","Data":"88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131"} Apr 23 08:17:28.198736 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:28.198737 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8f5ff9678-d7ctm" event={"ID":"5f75f47e-3806-4d72-88e2-7b0d59316df6","Type":"ContainerStarted","Data":"5dced97f5721cf82dcee355b76a012be3fc869016506b44d9cc6a6f5e3488830"} Apr 23 08:17:28.219021 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:28.218976 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8f5ff9678-d7ctm" podStartSLOduration=1.218962948 podStartE2EDuration="1.218962948s" podCreationTimestamp="2026-04-23 08:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:17:28.217829138 +0000 UTC m=+170.149191938" 
watchObservedRunningTime="2026-04-23 08:17:28.218962948 +0000 UTC m=+170.150325749" Apr 23 08:17:30.660027 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:30.659990 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:17:30.737377 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:30.737342 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65d9cf4bd-7rckb" Apr 23 08:17:30.737377 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:30.737384 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65d9cf4bd-7rckb" Apr 23 08:17:30.742033 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:30.742013 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65d9cf4bd-7rckb" Apr 23 08:17:31.218588 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:31.218562 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65d9cf4bd-7rckb" Apr 23 08:17:35.191401 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:35.191370 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-46nvx" Apr 23 08:17:37.486514 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:37.486469 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:37.486902 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:37.486528 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:37.491283 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:37.491239 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:38.245285 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:38.245239 2561 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8f5ff9678-d7ctm" Apr 23 08:17:38.291753 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:17:38.291726 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65d9cf4bd-7rckb"] Apr 23 08:18:00.303094 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:00.303066 2561 generic.go:358] "Generic (PLEG): container finished" podID="2cdf9e60-6a76-44c7-a819-39654b29c96a" containerID="c759433d42fa8e34fbe8df17aa7aa80694928d0341be963683f6775013fbabd8" exitCode=0 Apr 23 08:18:00.303436 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:00.303102 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-km78h" event={"ID":"2cdf9e60-6a76-44c7-a819-39654b29c96a","Type":"ContainerDied","Data":"c759433d42fa8e34fbe8df17aa7aa80694928d0341be963683f6775013fbabd8"} Apr 23 08:18:00.303436 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:00.303404 2561 scope.go:117] "RemoveContainer" containerID="c759433d42fa8e34fbe8df17aa7aa80694928d0341be963683f6775013fbabd8" Apr 23 08:18:01.309553 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:01.309520 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-km78h" event={"ID":"2cdf9e60-6a76-44c7-a819-39654b29c96a","Type":"ContainerStarted","Data":"b38d22d845b94dd67a17b555cbc360e99f17bf0c5f46e84774a9006efd715156"} Apr 23 08:18:02.313400 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:02.313367 2561 generic.go:358] "Generic (PLEG): container finished" podID="a42ca4f9-a9ae-4413-9c3d-fa18098d565a" containerID="727a231f06d78f1084fd3931bef25e3060bc82255f37e871aee9d11419881742" exitCode=0 Apr 23 08:18:02.313945 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:02.313441 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" 
event={"ID":"a42ca4f9-a9ae-4413-9c3d-fa18098d565a","Type":"ContainerDied","Data":"727a231f06d78f1084fd3931bef25e3060bc82255f37e871aee9d11419881742"} Apr 23 08:18:02.313945 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:02.313798 2561 scope.go:117] "RemoveContainer" containerID="727a231f06d78f1084fd3931bef25e3060bc82255f37e871aee9d11419881742" Apr 23 08:18:02.314933 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:02.314911 2561 generic.go:358] "Generic (PLEG): container finished" podID="d38a2fb0-c776-4a02-95f3-1c68963e1ef7" containerID="4bab73a84a6d6da982c3daf6b66d66b56a070cc1ec4de0e6c51a3f121bb0b1f4" exitCode=0 Apr 23 08:18:02.315024 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:02.314943 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" event={"ID":"d38a2fb0-c776-4a02-95f3-1c68963e1ef7","Type":"ContainerDied","Data":"4bab73a84a6d6da982c3daf6b66d66b56a070cc1ec4de0e6c51a3f121bb0b1f4"} Apr 23 08:18:02.315255 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:02.315221 2561 scope.go:117] "RemoveContainer" containerID="4bab73a84a6d6da982c3daf6b66d66b56a070cc1ec4de0e6c51a3f121bb0b1f4" Apr 23 08:18:03.315272 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.315174 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65d9cf4bd-7rckb" podUID="411dd30e-d871-429c-a929-ddfc5abddc5a" containerName="console" containerID="cri-o://5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e" gracePeriod=15 Apr 23 08:18:03.319133 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.319102 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zgcjm" event={"ID":"d38a2fb0-c776-4a02-95f3-1c68963e1ef7","Type":"ContainerStarted","Data":"078228329f59bf50699657eadaa139d41a300d7e3421b0f4efa6ed066c832005"} Apr 23 08:18:03.321019 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.320733 2561 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6qc5q" event={"ID":"a42ca4f9-a9ae-4413-9c3d-fa18098d565a","Type":"ContainerStarted","Data":"83c0ba06fb3c6a878bfcf36eab43736231c13539ee818c3f91bec7af1bbb391d"}
Apr 23 08:18:03.553746 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.553723 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65d9cf4bd-7rckb_411dd30e-d871-429c-a929-ddfc5abddc5a/console/0.log"
Apr 23 08:18:03.553856 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.553795 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:18:03.613351 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613252 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wktlm\" (UniqueName: \"kubernetes.io/projected/411dd30e-d871-429c-a929-ddfc5abddc5a-kube-api-access-wktlm\") pod \"411dd30e-d871-429c-a929-ddfc5abddc5a\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") "
Apr 23 08:18:03.613482 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613386 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-service-ca\") pod \"411dd30e-d871-429c-a929-ddfc5abddc5a\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") "
Apr 23 08:18:03.613482 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613407 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-oauth-config\") pod \"411dd30e-d871-429c-a929-ddfc5abddc5a\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") "
Apr 23 08:18:03.613482 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613433 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-console-config\") pod \"411dd30e-d871-429c-a929-ddfc5abddc5a\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") "
Apr 23 08:18:03.613482 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613467 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-trusted-ca-bundle\") pod \"411dd30e-d871-429c-a929-ddfc5abddc5a\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") "
Apr 23 08:18:03.613656 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613498 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-serving-cert\") pod \"411dd30e-d871-429c-a929-ddfc5abddc5a\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") "
Apr 23 08:18:03.613656 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613527 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-oauth-serving-cert\") pod \"411dd30e-d871-429c-a929-ddfc5abddc5a\" (UID: \"411dd30e-d871-429c-a929-ddfc5abddc5a\") "
Apr 23 08:18:03.613884 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613854 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-service-ca" (OuterVolumeSpecName: "service-ca") pod "411dd30e-d871-429c-a929-ddfc5abddc5a" (UID: "411dd30e-d871-429c-a929-ddfc5abddc5a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:03.613979 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613874 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-console-config" (OuterVolumeSpecName: "console-config") pod "411dd30e-d871-429c-a929-ddfc5abddc5a" (UID: "411dd30e-d871-429c-a929-ddfc5abddc5a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:03.614042 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.613984 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "411dd30e-d871-429c-a929-ddfc5abddc5a" (UID: "411dd30e-d871-429c-a929-ddfc5abddc5a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:03.614042 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.614027 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "411dd30e-d871-429c-a929-ddfc5abddc5a" (UID: "411dd30e-d871-429c-a929-ddfc5abddc5a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:03.615628 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.615609 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "411dd30e-d871-429c-a929-ddfc5abddc5a" (UID: "411dd30e-d871-429c-a929-ddfc5abddc5a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:03.615828 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.615809 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411dd30e-d871-429c-a929-ddfc5abddc5a-kube-api-access-wktlm" (OuterVolumeSpecName: "kube-api-access-wktlm") pod "411dd30e-d871-429c-a929-ddfc5abddc5a" (UID: "411dd30e-d871-429c-a929-ddfc5abddc5a"). InnerVolumeSpecName "kube-api-access-wktlm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:18:03.615890 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.615805 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "411dd30e-d871-429c-a929-ddfc5abddc5a" (UID: "411dd30e-d871-429c-a929-ddfc5abddc5a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:03.714383 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.714337 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-service-ca\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:18:03.714383 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.714378 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-oauth-config\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:18:03.714383 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.714394 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-console-config\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:18:03.714624 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.714409 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-trusted-ca-bundle\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:18:03.714624 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.714421 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/411dd30e-d871-429c-a929-ddfc5abddc5a-console-serving-cert\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:18:03.714624 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.714434 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/411dd30e-d871-429c-a929-ddfc5abddc5a-oauth-serving-cert\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:18:03.714624 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:03.714446 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wktlm\" (UniqueName: \"kubernetes.io/projected/411dd30e-d871-429c-a929-ddfc5abddc5a-kube-api-access-wktlm\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:18:04.324611 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.324585 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65d9cf4bd-7rckb_411dd30e-d871-429c-a929-ddfc5abddc5a/console/0.log"
Apr 23 08:18:04.325032 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.324624 2561 generic.go:358] "Generic (PLEG): container finished" podID="411dd30e-d871-429c-a929-ddfc5abddc5a" containerID="5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e" exitCode=2
Apr 23 08:18:04.325032 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.324671 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d9cf4bd-7rckb" event={"ID":"411dd30e-d871-429c-a929-ddfc5abddc5a","Type":"ContainerDied","Data":"5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e"}
Apr 23 08:18:04.325032 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.324684 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d9cf4bd-7rckb"
Apr 23 08:18:04.325032 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.324693 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d9cf4bd-7rckb" event={"ID":"411dd30e-d871-429c-a929-ddfc5abddc5a","Type":"ContainerDied","Data":"a4b7a794cefd627ae19ee0c76efc78df1740828aa1fda9a1dcb982a4b1deff1c"}
Apr 23 08:18:04.325032 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.324708 2561 scope.go:117] "RemoveContainer" containerID="5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e"
Apr 23 08:18:04.332999 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.332978 2561 scope.go:117] "RemoveContainer" containerID="5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e"
Apr 23 08:18:04.333244 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:18:04.333224 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e\": container with ID starting with 5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e not found: ID does not exist" containerID="5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e"
Apr 23 08:18:04.333338 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.333257 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e"} err="failed to get container status \"5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e\": rpc error: code = NotFound desc = could not find container \"5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e\": container with ID starting with 5b143650265c06aeeaf2ee0e0fd1ed4946bd2117a64fa00a639f19ee2e96803e not found: ID does not exist"
Apr 23 08:18:04.347198 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.347171 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65d9cf4bd-7rckb"]
Apr 23 08:18:04.351663 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.351643 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65d9cf4bd-7rckb"]
Apr 23 08:18:04.664088 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:04.664012 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411dd30e-d871-429c-a929-ddfc5abddc5a" path="/var/lib/kubelet/pods/411dd30e-d871-429c-a929-ddfc5abddc5a/volumes"
Apr 23 08:18:34.208721 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.208684 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cf9f45756-jcjns"]
Apr 23 08:18:34.209231 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.209011 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="411dd30e-d871-429c-a929-ddfc5abddc5a" containerName="console"
Apr 23 08:18:34.209231 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.209023 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="411dd30e-d871-429c-a929-ddfc5abddc5a" containerName="console"
Apr 23 08:18:34.209231 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.209091 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="411dd30e-d871-429c-a929-ddfc5abddc5a" containerName="console"
Apr 23 08:18:34.212096 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.212073 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.229571 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.229545 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf9f45756-jcjns"]
Apr 23 08:18:34.265733 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.265708 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-oauth-serving-cert\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.265873 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.265757 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-service-ca\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.265873 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.265807 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-config\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.265873 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.265840 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-oauth-config\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.265873 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.265869 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-trusted-ca-bundle\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.266079 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.265942 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwftg\" (UniqueName: \"kubernetes.io/projected/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-kube-api-access-xwftg\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.266079 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.265984 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-serving-cert\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.367131 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367096 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-oauth-serving-cert\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.367298 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367146 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-service-ca\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.367298 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367173 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-config\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.367298 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367197 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-oauth-config\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.367298 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367224 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-trusted-ca-bundle\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.367519 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367312 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwftg\" (UniqueName: \"kubernetes.io/projected/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-kube-api-access-xwftg\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.367519 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367359 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-serving-cert\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.367909 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367874 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-service-ca\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.368011 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367910 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-oauth-serving-cert\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.368011 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.367946 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-config\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.368155 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.368135 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-trusted-ca-bundle\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.369686 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.369665 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-oauth-config\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.369779 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.369765 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-serving-cert\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.375226 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.375204 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwftg\" (UniqueName: \"kubernetes.io/projected/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-kube-api-access-xwftg\") pod \"console-7cf9f45756-jcjns\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") " pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.460322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.460248 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 08:18:34.460687 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.460649 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="alertmanager" containerID="cri-o://d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08" gracePeriod=120
Apr 23 08:18:34.460761 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.460732 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy-web" containerID="cri-o://a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb" gracePeriod=120
Apr 23 08:18:34.460761 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.460747 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="config-reloader" containerID="cri-o://02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6" gracePeriod=120
Apr 23 08:18:34.460879 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.460767 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy" containerID="cri-o://ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb" gracePeriod=120
Apr 23 08:18:34.460879 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.460773 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="prom-label-proxy" containerID="cri-o://2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83" gracePeriod=120
Apr 23 08:18:34.460879 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.460724 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy-metric" containerID="cri-o://cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b" gracePeriod=120
Apr 23 08:18:34.520964 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.520940 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:18:34.645099 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:34.645075 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf9f45756-jcjns"]
Apr 23 08:18:34.647476 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:18:34.647448 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef3b1a8d_f3a4_42e7_bda5_a50593cac0fd.slice/crio-c20b9f40f80b779fe021ea89d5f1fc210ab790f9875f75a53a02ef2b839690e4 WatchSource:0}: Error finding container c20b9f40f80b779fe021ea89d5f1fc210ab790f9875f75a53a02ef2b839690e4: Status 404 returned error can't find the container with id c20b9f40f80b779fe021ea89d5f1fc210ab790f9875f75a53a02ef2b839690e4
Apr 23 08:18:35.420336 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.420293 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf9f45756-jcjns" event={"ID":"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd","Type":"ContainerStarted","Data":"a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb"}
Apr 23 08:18:35.420740 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.420350 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf9f45756-jcjns" event={"ID":"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd","Type":"ContainerStarted","Data":"c20b9f40f80b779fe021ea89d5f1fc210ab790f9875f75a53a02ef2b839690e4"}
Apr 23 08:18:35.428074 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.428048 2561 generic.go:358] "Generic (PLEG): container finished" podID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerID="2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83" exitCode=0
Apr 23 08:18:35.428074 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.428067 2561 generic.go:358] "Generic (PLEG): container finished" podID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerID="ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb" exitCode=0
Apr 23 08:18:35.428074 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.428073 2561 generic.go:358] "Generic (PLEG): container finished" podID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerID="02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6" exitCode=0
Apr 23 08:18:35.428074 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.428079 2561 generic.go:358] "Generic (PLEG): container finished" podID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerID="d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08" exitCode=0
Apr 23 08:18:35.428314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.428102 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerDied","Data":"2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83"}
Apr 23 08:18:35.428314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.428121 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerDied","Data":"ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb"}
Apr 23 08:18:35.428314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.428132 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerDied","Data":"02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6"}
Apr 23 08:18:35.428314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.428140 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerDied","Data":"d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08"}
Apr 23 08:18:35.441046 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.441005 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cf9f45756-jcjns" podStartSLOduration=1.440993173 podStartE2EDuration="1.440993173s" podCreationTimestamp="2026-04-23 08:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:18:35.439897352 +0000 UTC m=+237.371260177" watchObservedRunningTime="2026-04-23 08:18:35.440993173 +0000 UTC m=+237.372355974"
Apr 23 08:18:35.708924 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.708900 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 08:18:35.777030 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.776979 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-trusted-ca-bundle\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777030 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777025 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-web\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777249 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777052 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-config-volume\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777249 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777232 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-web-config\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777381 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777318 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777381 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777355 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777486 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777403 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-metrics-client-ca\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777486 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777436 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-cluster-tls-config\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777486 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777440 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:35.777486 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777480 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-main-db\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777679 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777518 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64ngt\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-kube-api-access-64ngt\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777679 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777551 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-config-out\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777679 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777585 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-tls-assets\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777679 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777609 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-main-tls\") pod \"fb15804e-aa87-4672-93ff-da7e97217b1f\" (UID: \"fb15804e-aa87-4672-93ff-da7e97217b1f\") "
Apr 23 08:18:35.777896 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777857 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:18:35.777949 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777917 2561 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:18:35.777949 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.777937 2561 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb15804e-aa87-4672-93ff-da7e97217b1f-metrics-client-ca\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:18:35.778851 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.778560 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:18:35.780340 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.780299 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:35.780458 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.780391 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:35.780518 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.780469 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:18:35.780705 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.780661 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "tls-assets".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:18:35.781056 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.781025 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:18:35.781690 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.781662 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-kube-api-access-64ngt" (OuterVolumeSpecName: "kube-api-access-64ngt") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "kube-api-access-64ngt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:18:35.782052 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.782027 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:18:35.782246 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.782223 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-config-out" (OuterVolumeSpecName: "config-out") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:18:35.806495 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.806459 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:18:35.812636 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.812612 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-web-config" (OuterVolumeSpecName: "web-config") pod "fb15804e-aa87-4672-93ff-da7e97217b1f" (UID: "fb15804e-aa87-4672-93ff-da7e97217b1f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 08:18:35.879130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879105 2561 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:35.879130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879129 2561 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:35.879322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879140 2561 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-cluster-tls-config\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 
08:18:35.879322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879150 2561 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-alertmanager-main-db\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:35.879322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879158 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-64ngt\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-kube-api-access-64ngt\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:35.879322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879168 2561 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb15804e-aa87-4672-93ff-da7e97217b1f-config-out\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:35.879322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879176 2561 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb15804e-aa87-4672-93ff-da7e97217b1f-tls-assets\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:35.879322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879186 2561 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-main-tls\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:35.879322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879194 2561 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:35.879322 ip-10-0-134-8 kubenswrapper[2561]: I0423 
08:18:35.879203 2561 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-config-volume\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:35.879322 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:35.879211 2561 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb15804e-aa87-4672-93ff-da7e97217b1f-web-config\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\"" Apr 23 08:18:36.433564 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.433527 2561 generic.go:358] "Generic (PLEG): container finished" podID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerID="cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b" exitCode=0 Apr 23 08:18:36.433564 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.433557 2561 generic.go:358] "Generic (PLEG): container finished" podID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerID="a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb" exitCode=0 Apr 23 08:18:36.433986 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.433610 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerDied","Data":"cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b"} Apr 23 08:18:36.433986 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.433652 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerDied","Data":"a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb"} Apr 23 08:18:36.433986 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.433653 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.433986 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.433663 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fb15804e-aa87-4672-93ff-da7e97217b1f","Type":"ContainerDied","Data":"b5a2489d82692a94602a675235188e5c8115e2e8a7b10b135cf6a304eb7e4ab2"} Apr 23 08:18:36.433986 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.433678 2561 scope.go:117] "RemoveContainer" containerID="2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83" Apr 23 08:18:36.441823 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.441806 2561 scope.go:117] "RemoveContainer" containerID="cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b" Apr 23 08:18:36.448532 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.448513 2561 scope.go:117] "RemoveContainer" containerID="ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb" Apr 23 08:18:36.454645 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.454632 2561 scope.go:117] "RemoveContainer" containerID="a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb" Apr 23 08:18:36.457612 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.457590 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:18:36.460993 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.460970 2561 scope.go:117] "RemoveContainer" containerID="02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6" Apr 23 08:18:36.467406 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.467391 2561 scope.go:117] "RemoveContainer" containerID="d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08" Apr 23 08:18:36.467674 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.467648 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:18:36.473609 ip-10-0-134-8 
kubenswrapper[2561]: I0423 08:18:36.473592 2561 scope.go:117] "RemoveContainer" containerID="2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2" Apr 23 08:18:36.479449 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.479435 2561 scope.go:117] "RemoveContainer" containerID="2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83" Apr 23 08:18:36.479725 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:18:36.479699 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83\": container with ID starting with 2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83 not found: ID does not exist" containerID="2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83" Apr 23 08:18:36.479808 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.479723 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83"} err="failed to get container status \"2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83\": rpc error: code = NotFound desc = could not find container \"2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83\": container with ID starting with 2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83 not found: ID does not exist" Apr 23 08:18:36.479808 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.479741 2561 scope.go:117] "RemoveContainer" containerID="cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b" Apr 23 08:18:36.479950 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:18:36.479932 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b\": container with ID starting with 
cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b not found: ID does not exist" containerID="cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b" Apr 23 08:18:36.480009 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.479960 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b"} err="failed to get container status \"cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b\": rpc error: code = NotFound desc = could not find container \"cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b\": container with ID starting with cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b not found: ID does not exist" Apr 23 08:18:36.480009 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.479985 2561 scope.go:117] "RemoveContainer" containerID="ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb" Apr 23 08:18:36.480231 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:18:36.480215 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb\": container with ID starting with ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb not found: ID does not exist" containerID="ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb" Apr 23 08:18:36.480309 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.480237 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb"} err="failed to get container status \"ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb\": rpc error: code = NotFound desc = could not find container \"ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb\": container with ID starting with 
ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb not found: ID does not exist" Apr 23 08:18:36.480309 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.480250 2561 scope.go:117] "RemoveContainer" containerID="a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb" Apr 23 08:18:36.480494 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:18:36.480480 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb\": container with ID starting with a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb not found: ID does not exist" containerID="a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb" Apr 23 08:18:36.480529 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.480498 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb"} err="failed to get container status \"a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb\": rpc error: code = NotFound desc = could not find container \"a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb\": container with ID starting with a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb not found: ID does not exist" Apr 23 08:18:36.480529 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.480511 2561 scope.go:117] "RemoveContainer" containerID="02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6" Apr 23 08:18:36.480719 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:18:36.480700 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6\": container with ID starting with 02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6 not found: ID does not exist" 
containerID="02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6" Apr 23 08:18:36.480760 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.480723 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6"} err="failed to get container status \"02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6\": rpc error: code = NotFound desc = could not find container \"02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6\": container with ID starting with 02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6 not found: ID does not exist" Apr 23 08:18:36.480760 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.480740 2561 scope.go:117] "RemoveContainer" containerID="d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08" Apr 23 08:18:36.480931 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:18:36.480914 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08\": container with ID starting with d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08 not found: ID does not exist" containerID="d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08" Apr 23 08:18:36.481007 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.480989 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08"} err="failed to get container status \"d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08\": rpc error: code = NotFound desc = could not find container \"d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08\": container with ID starting with d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08 not found: ID does not exist" Apr 23 08:18:36.481072 
ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.481012 2561 scope.go:117] "RemoveContainer" containerID="2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2" Apr 23 08:18:36.481252 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:18:36.481233 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2\": container with ID starting with 2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2 not found: ID does not exist" containerID="2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2" Apr 23 08:18:36.481353 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.481255 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2"} err="failed to get container status \"2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2\": rpc error: code = NotFound desc = could not find container \"2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2\": container with ID starting with 2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2 not found: ID does not exist" Apr 23 08:18:36.481353 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.481287 2561 scope.go:117] "RemoveContainer" containerID="2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83" Apr 23 08:18:36.481544 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.481519 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83"} err="failed to get container status \"2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83\": rpc error: code = NotFound desc = could not find container \"2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83\": container with ID starting with 
2dbee9fbf98dc4215292b9324e21ac122aadad69140bcf1e57aa3293022deb83 not found: ID does not exist" Apr 23 08:18:36.481591 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.481545 2561 scope.go:117] "RemoveContainer" containerID="cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b" Apr 23 08:18:36.481758 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.481739 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b"} err="failed to get container status \"cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b\": rpc error: code = NotFound desc = could not find container \"cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b\": container with ID starting with cee468753e27d6c6f0ccdbfd0c483c6afe5638b6b34554d3ec474781bdb1058b not found: ID does not exist" Apr 23 08:18:36.481817 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.481760 2561 scope.go:117] "RemoveContainer" containerID="ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb" Apr 23 08:18:36.481962 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.481945 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb"} err="failed to get container status \"ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb\": rpc error: code = NotFound desc = could not find container \"ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb\": container with ID starting with ea306a2e357e2fe44440fab3e83a04acd40bacfc9fcd71651d370f80975640cb not found: ID does not exist" Apr 23 08:18:36.482009 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.481962 2561 scope.go:117] "RemoveContainer" containerID="a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb" Apr 23 08:18:36.482130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.482116 2561 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb"} err="failed to get container status \"a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb\": rpc error: code = NotFound desc = could not find container \"a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb\": container with ID starting with a8bf86fe1b31d8ee70dc18597402542bb52ea74676b81ada2fcc25fa7251cdfb not found: ID does not exist" Apr 23 08:18:36.482179 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.482130 2561 scope.go:117] "RemoveContainer" containerID="02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6" Apr 23 08:18:36.482356 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.482337 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6"} err="failed to get container status \"02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6\": rpc error: code = NotFound desc = could not find container \"02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6\": container with ID starting with 02288cad83fe5ef3f81e9a13ea416884f6d115942214c78569403ea93b96ccb6 not found: ID does not exist" Apr 23 08:18:36.482407 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.482357 2561 scope.go:117] "RemoveContainer" containerID="d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08" Apr 23 08:18:36.482573 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.482554 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08"} err="failed to get container status \"d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08\": rpc error: code = NotFound desc = could not find container 
\"d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08\": container with ID starting with d148293d64be53cd74e0a0b4ff3df62ac83b813caf95d0c841c2402b316a6b08 not found: ID does not exist" Apr 23 08:18:36.482573 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.482572 2561 scope.go:117] "RemoveContainer" containerID="2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2" Apr 23 08:18:36.482892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.482860 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2"} err="failed to get container status \"2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2\": rpc error: code = NotFound desc = could not find container \"2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2\": container with ID starting with 2f1fb9308b03bb0a3f0463e5e82f892716a830bcbc2c85da56eb2d711fb0cfc2 not found: ID does not exist" Apr 23 08:18:36.494236 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494219 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:18:36.494559 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494546 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="alertmanager" Apr 23 08:18:36.494598 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494561 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="alertmanager" Apr 23 08:18:36.494598 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494569 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy-web" Apr 23 08:18:36.494598 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494574 2561 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy-web" Apr 23 08:18:36.494598 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494585 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="init-config-reloader" Apr 23 08:18:36.494598 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494590 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="init-config-reloader" Apr 23 08:18:36.494598 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494597 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="config-reloader" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494602 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="config-reloader" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494613 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy-metric" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494619 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy-metric" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494625 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494631 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494640 2561 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="prom-label-proxy" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494645 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="prom-label-proxy" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494690 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="config-reloader" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494701 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494709 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy-web" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494716 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="prom-label-proxy" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494723 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="kube-rbac-proxy-metric" Apr 23 08:18:36.494771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.494729 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" containerName="alertmanager" Apr 23 08:18:36.499786 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.499772 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.502532 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.502514 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 08:18:36.502687 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.502670 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 08:18:36.502760 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.502724 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 08:18:36.502810 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.502771 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 08:18:36.502901 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.502885 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 08:18:36.502973 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.502956 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 08:18:36.503024 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.502982 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 08:18:36.503072 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.503023 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-z9nvw\"" Apr 23 08:18:36.503123 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.503089 2561 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 08:18:36.508575 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.508558 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 08:18:36.511560 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.511541 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:18:36.583224 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583191 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e7d4824-41fb-4a7a-959b-dc8a61e71187-config-out\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583224 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583223 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e7d4824-41fb-4a7a-959b-dc8a61e71187-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583439 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583243 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-web-config\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583439 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583274 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583439 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583336 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e7d4824-41fb-4a7a-959b-dc8a61e71187-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583439 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583374 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583439 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583392 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e7d4824-41fb-4a7a-959b-dc8a61e71187-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583439 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583414 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583439 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583435 2561 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583706 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583477 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583706 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583551 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583706 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583595 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhl7q\" (UniqueName: \"kubernetes.io/projected/8e7d4824-41fb-4a7a-959b-dc8a61e71187-kube-api-access-jhl7q\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.583706 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.583634 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8e7d4824-41fb-4a7a-959b-dc8a61e71187-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.664875 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.664795 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb15804e-aa87-4672-93ff-da7e97217b1f" path="/var/lib/kubelet/pods/fb15804e-aa87-4672-93ff-da7e97217b1f/volumes" Apr 23 08:18:36.684695 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684654 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.684695 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684695 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhl7q\" (UniqueName: \"kubernetes.io/projected/8e7d4824-41fb-4a7a-959b-dc8a61e71187-kube-api-access-jhl7q\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.684942 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684723 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e7d4824-41fb-4a7a-959b-dc8a61e71187-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.684942 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684773 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/8e7d4824-41fb-4a7a-959b-dc8a61e71187-config-out\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.684942 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684797 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e7d4824-41fb-4a7a-959b-dc8a61e71187-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.684942 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684824 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-web-config\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.684942 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684851 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.684942 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684894 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e7d4824-41fb-4a7a-959b-dc8a61e71187-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.684942 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684931 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.685292 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684954 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e7d4824-41fb-4a7a-959b-dc8a61e71187-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.685292 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.684991 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.685292 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.685030 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.685292 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.685063 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.685662 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.685596 
2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e7d4824-41fb-4a7a-959b-dc8a61e71187-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.686106 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.685768 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e7d4824-41fb-4a7a-959b-dc8a61e71187-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.687866 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.687644 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e7d4824-41fb-4a7a-959b-dc8a61e71187-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.687866 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.687699 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e7d4824-41fb-4a7a-959b-dc8a61e71187-config-out\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.687866 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.687770 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-web-config\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.687866 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.687857 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.688118 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.687959 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e7d4824-41fb-4a7a-959b-dc8a61e71187-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.688180 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.688122 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.688239 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.688215 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.688515 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.688492 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.689027 ip-10-0-134-8 
kubenswrapper[2561]: I0423 08:18:36.689004 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.689537 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.689519 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e7d4824-41fb-4a7a-959b-dc8a61e71187-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.692569 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.692549 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhl7q\" (UniqueName: \"kubernetes.io/projected/8e7d4824-41fb-4a7a-959b-dc8a61e71187-kube-api-access-jhl7q\") pod \"alertmanager-main-0\" (UID: \"8e7d4824-41fb-4a7a-959b-dc8a61e71187\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.809448 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.809406 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:18:36.934361 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:36.934335 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:18:36.936861 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:18:36.936835 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7d4824_41fb_4a7a_959b_dc8a61e71187.slice/crio-ec8bee38d100325a893552f114927fae2b60a3bdda7556675c7449f1b46938c1 WatchSource:0}: Error finding container ec8bee38d100325a893552f114927fae2b60a3bdda7556675c7449f1b46938c1: Status 404 returned error can't find the container with id ec8bee38d100325a893552f114927fae2b60a3bdda7556675c7449f1b46938c1 Apr 23 08:18:37.437964 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:37.437926 2561 generic.go:358] "Generic (PLEG): container finished" podID="8e7d4824-41fb-4a7a-959b-dc8a61e71187" containerID="6b1badaef30b9ef12f17190fdc8cb7ec9708beabcd0da4c626c76a9539365290" exitCode=0 Apr 23 08:18:37.438420 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:37.438018 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e7d4824-41fb-4a7a-959b-dc8a61e71187","Type":"ContainerDied","Data":"6b1badaef30b9ef12f17190fdc8cb7ec9708beabcd0da4c626c76a9539365290"} Apr 23 08:18:37.438420 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:37.438057 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e7d4824-41fb-4a7a-959b-dc8a61e71187","Type":"ContainerStarted","Data":"ec8bee38d100325a893552f114927fae2b60a3bdda7556675c7449f1b46938c1"} Apr 23 08:18:38.445301 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:38.445251 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"8e7d4824-41fb-4a7a-959b-dc8a61e71187","Type":"ContainerStarted","Data":"37d55bcbd8cd9025fc0d2cbaaea1fdedf02526c1120acca25748390c47658262"} Apr 23 08:18:38.445301 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:38.445301 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e7d4824-41fb-4a7a-959b-dc8a61e71187","Type":"ContainerStarted","Data":"dc46bd1cf6c293a0d6e2fcdb77aebdff4ff84277a3f0b84b3d2c9f7b2a9e9131"} Apr 23 08:18:38.445710 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:38.445313 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e7d4824-41fb-4a7a-959b-dc8a61e71187","Type":"ContainerStarted","Data":"b651ee7267b1ce875c8c2fc69ceef2d768d086aa287f0423ae1487721488ad15"} Apr 23 08:18:38.445710 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:38.445320 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e7d4824-41fb-4a7a-959b-dc8a61e71187","Type":"ContainerStarted","Data":"7a7b73430bef9be91a942f7cc1a8bb1b992d98820f11f0d86a6d57bdbefc86dd"} Apr 23 08:18:38.445710 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:38.445328 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e7d4824-41fb-4a7a-959b-dc8a61e71187","Type":"ContainerStarted","Data":"92d838ffa0430c4a6df105e4fc703d98ae627b1f1feedfcd6df6b985202297bd"} Apr 23 08:18:38.445710 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:38.445335 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e7d4824-41fb-4a7a-959b-dc8a61e71187","Type":"ContainerStarted","Data":"645bd37ce7b91a2f7378d558af030af02a9cd3c725759e8ee575f921f7952b60"} Apr 23 08:18:38.472895 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:38.472854 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.472841582 podStartE2EDuration="2.472841582s" podCreationTimestamp="2026-04-23 08:18:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:18:38.470625275 +0000 UTC m=+240.401988088" watchObservedRunningTime="2026-04-23 08:18:38.472841582 +0000 UTC m=+240.404204383" Apr 23 08:18:44.521557 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:44.521528 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cf9f45756-jcjns" Apr 23 08:18:44.521935 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:44.521598 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cf9f45756-jcjns" Apr 23 08:18:44.526058 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:44.526039 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cf9f45756-jcjns" Apr 23 08:18:45.469025 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:45.468998 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cf9f45756-jcjns" Apr 23 08:18:45.511593 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:45.511568 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8f5ff9678-d7ctm"] Apr 23 08:18:50.496368 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:50.496335 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:18:50.498865 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:50.498837 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e92a791e-42ac-4855-b7b5-945f53108891-metrics-certs\") pod \"network-metrics-daemon-pmv55\" (UID: \"e92a791e-42ac-4855-b7b5-945f53108891\") " pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:18:50.763796 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:50.763720 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vjj4z\"" Apr 23 08:18:50.771926 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:50.771901 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmv55" Apr 23 08:18:50.887063 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:50.887035 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pmv55"] Apr 23 08:18:50.890105 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:18:50.890070 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode92a791e_42ac_4855_b7b5_945f53108891.slice/crio-39531782907e285dda3fcb0d35f641bb371db4407f96dfb098e06099d11c45a4 WatchSource:0}: Error finding container 39531782907e285dda3fcb0d35f641bb371db4407f96dfb098e06099d11c45a4: Status 404 returned error can't find the container with id 39531782907e285dda3fcb0d35f641bb371db4407f96dfb098e06099d11c45a4 Apr 23 08:18:51.482506 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:51.482468 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pmv55" event={"ID":"e92a791e-42ac-4855-b7b5-945f53108891","Type":"ContainerStarted","Data":"39531782907e285dda3fcb0d35f641bb371db4407f96dfb098e06099d11c45a4"} Apr 23 08:18:52.487721 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:52.487687 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pmv55" 
event={"ID":"e92a791e-42ac-4855-b7b5-945f53108891","Type":"ContainerStarted","Data":"eb2c80893ac5624b52c27e7a40448f5041169a23c1c961966c81272e3ab731e6"} Apr 23 08:18:52.488071 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:52.487726 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pmv55" event={"ID":"e92a791e-42ac-4855-b7b5-945f53108891","Type":"ContainerStarted","Data":"48f8b226ef2fd900d5a76b3250e4edf6aea717d021d54d031dc10fdcc8c922ee"} Apr 23 08:18:52.506001 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:18:52.505955 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pmv55" podStartSLOduration=253.327561224 podStartE2EDuration="4m14.505940446s" podCreationTimestamp="2026-04-23 08:14:38 +0000 UTC" firstStartedPulling="2026-04-23 08:18:50.891949003 +0000 UTC m=+252.823311785" lastFinishedPulling="2026-04-23 08:18:52.070328223 +0000 UTC m=+254.001691007" observedRunningTime="2026-04-23 08:18:52.505305639 +0000 UTC m=+254.436668441" watchObservedRunningTime="2026-04-23 08:18:52.505940446 +0000 UTC m=+254.437303247" Apr 23 08:19:10.533839 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.533739 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8f5ff9678-d7ctm" podUID="5f75f47e-3806-4d72-88e2-7b0d59316df6" containerName="console" containerID="cri-o://88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131" gracePeriod=15 Apr 23 08:19:10.778750 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.778731 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8f5ff9678-d7ctm_5f75f47e-3806-4d72-88e2-7b0d59316df6/console/0.log" Apr 23 08:19:10.778846 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.778787 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8f5ff9678-d7ctm"
Apr 23 08:19:10.861439 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.861365 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-oauth-config\") pod \"5f75f47e-3806-4d72-88e2-7b0d59316df6\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") "
Apr 23 08:19:10.861439 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.861421 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-serving-cert\") pod \"5f75f47e-3806-4d72-88e2-7b0d59316df6\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") "
Apr 23 08:19:10.861623 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.861447 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-service-ca\") pod \"5f75f47e-3806-4d72-88e2-7b0d59316df6\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") "
Apr 23 08:19:10.861623 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.861470 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-config\") pod \"5f75f47e-3806-4d72-88e2-7b0d59316df6\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") "
Apr 23 08:19:10.861623 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.861506 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drhls\" (UniqueName: \"kubernetes.io/projected/5f75f47e-3806-4d72-88e2-7b0d59316df6-kube-api-access-drhls\") pod \"5f75f47e-3806-4d72-88e2-7b0d59316df6\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") "
Apr 23 08:19:10.861780 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.861703 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-trusted-ca-bundle\") pod \"5f75f47e-3806-4d72-88e2-7b0d59316df6\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") "
Apr 23 08:19:10.861780 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.861757 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-oauth-serving-cert\") pod \"5f75f47e-3806-4d72-88e2-7b0d59316df6\" (UID: \"5f75f47e-3806-4d72-88e2-7b0d59316df6\") "
Apr 23 08:19:10.861913 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.861881 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-config" (OuterVolumeSpecName: "console-config") pod "5f75f47e-3806-4d72-88e2-7b0d59316df6" (UID: "5f75f47e-3806-4d72-88e2-7b0d59316df6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:19:10.862017 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.861883 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-service-ca" (OuterVolumeSpecName: "service-ca") pod "5f75f47e-3806-4d72-88e2-7b0d59316df6" (UID: "5f75f47e-3806-4d72-88e2-7b0d59316df6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:19:10.862164 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.862147 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-service-ca\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:19:10.862226 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.862174 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-config\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:19:10.862226 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.862140 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5f75f47e-3806-4d72-88e2-7b0d59316df6" (UID: "5f75f47e-3806-4d72-88e2-7b0d59316df6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:19:10.862226 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.862192 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5f75f47e-3806-4d72-88e2-7b0d59316df6" (UID: "5f75f47e-3806-4d72-88e2-7b0d59316df6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:19:10.863562 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.863539 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5f75f47e-3806-4d72-88e2-7b0d59316df6" (UID: "5f75f47e-3806-4d72-88e2-7b0d59316df6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:19:10.863689 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.863582 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5f75f47e-3806-4d72-88e2-7b0d59316df6" (UID: "5f75f47e-3806-4d72-88e2-7b0d59316df6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:19:10.863689 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.863628 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f75f47e-3806-4d72-88e2-7b0d59316df6-kube-api-access-drhls" (OuterVolumeSpecName: "kube-api-access-drhls") pod "5f75f47e-3806-4d72-88e2-7b0d59316df6" (UID: "5f75f47e-3806-4d72-88e2-7b0d59316df6"). InnerVolumeSpecName "kube-api-access-drhls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:19:10.963232 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.963210 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-trusted-ca-bundle\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:19:10.963232 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.963229 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f75f47e-3806-4d72-88e2-7b0d59316df6-oauth-serving-cert\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:19:10.963379 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.963239 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-oauth-config\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:19:10.963379 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.963247 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f75f47e-3806-4d72-88e2-7b0d59316df6-console-serving-cert\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:19:10.963379 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:10.963256 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-drhls\" (UniqueName: \"kubernetes.io/projected/5f75f47e-3806-4d72-88e2-7b0d59316df6-kube-api-access-drhls\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:19:11.546749 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.546723 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8f5ff9678-d7ctm_5f75f47e-3806-4d72-88e2-7b0d59316df6/console/0.log"
Apr 23 08:19:11.547238 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.546761 2561 generic.go:358] "Generic (PLEG): container finished" podID="5f75f47e-3806-4d72-88e2-7b0d59316df6" containerID="88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131" exitCode=2
Apr 23 08:19:11.547238 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.546815 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8f5ff9678-d7ctm" event={"ID":"5f75f47e-3806-4d72-88e2-7b0d59316df6","Type":"ContainerDied","Data":"88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131"}
Apr 23 08:19:11.547238 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.546839 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8f5ff9678-d7ctm" event={"ID":"5f75f47e-3806-4d72-88e2-7b0d59316df6","Type":"ContainerDied","Data":"5dced97f5721cf82dcee355b76a012be3fc869016506b44d9cc6a6f5e3488830"}
Apr 23 08:19:11.547238 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.546843 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8f5ff9678-d7ctm"
Apr 23 08:19:11.547238 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.546853 2561 scope.go:117] "RemoveContainer" containerID="88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131"
Apr 23 08:19:11.557846 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.557824 2561 scope.go:117] "RemoveContainer" containerID="88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131"
Apr 23 08:19:11.558093 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:19:11.558067 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131\": container with ID starting with 88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131 not found: ID does not exist" containerID="88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131"
Apr 23 08:19:11.558142 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.558103 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131"} err="failed to get container status \"88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131\": rpc error: code = NotFound desc = could not find container \"88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131\": container with ID starting with 88fd90a5cf2a2beb1a35c0efa1a2f951c67087694da7e3c040fce2ecc5e01131 not found: ID does not exist"
Apr 23 08:19:11.573736 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.573714 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8f5ff9678-d7ctm"]
Apr 23 08:19:11.576591 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:11.576570 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8f5ff9678-d7ctm"]
Apr 23 08:19:12.664214 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:12.664176 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f75f47e-3806-4d72-88e2-7b0d59316df6" path="/var/lib/kubelet/pods/5f75f47e-3806-4d72-88e2-7b0d59316df6/volumes"
Apr 23 08:19:16.002547 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.002516 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-t4t69"]
Apr 23 08:19:16.002914 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.002829 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f75f47e-3806-4d72-88e2-7b0d59316df6" containerName="console"
Apr 23 08:19:16.002914 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.002839 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f75f47e-3806-4d72-88e2-7b0d59316df6" containerName="console"
Apr 23 08:19:16.002914 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.002900 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f75f47e-3806-4d72-88e2-7b0d59316df6" containerName="console"
Apr 23 08:19:16.007939 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.007918 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.010393 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.010374 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 08:19:16.013731 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.013708 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t4t69"]
Apr 23 08:19:16.103443 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.103370 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d49c3330-f379-49e7-92d4-30be09c51dd6-kubelet-config\") pod \"global-pull-secret-syncer-t4t69\" (UID: \"d49c3330-f379-49e7-92d4-30be09c51dd6\") " pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.103443 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.103404 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d49c3330-f379-49e7-92d4-30be09c51dd6-original-pull-secret\") pod \"global-pull-secret-syncer-t4t69\" (UID: \"d49c3330-f379-49e7-92d4-30be09c51dd6\") " pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.103650 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.103504 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d49c3330-f379-49e7-92d4-30be09c51dd6-dbus\") pod \"global-pull-secret-syncer-t4t69\" (UID: \"d49c3330-f379-49e7-92d4-30be09c51dd6\") " pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.204845 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.204798 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d49c3330-f379-49e7-92d4-30be09c51dd6-dbus\") pod \"global-pull-secret-syncer-t4t69\" (UID: \"d49c3330-f379-49e7-92d4-30be09c51dd6\") " pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.205015 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.204863 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d49c3330-f379-49e7-92d4-30be09c51dd6-kubelet-config\") pod \"global-pull-secret-syncer-t4t69\" (UID: \"d49c3330-f379-49e7-92d4-30be09c51dd6\") " pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.205015 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.204884 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d49c3330-f379-49e7-92d4-30be09c51dd6-original-pull-secret\") pod \"global-pull-secret-syncer-t4t69\" (UID: \"d49c3330-f379-49e7-92d4-30be09c51dd6\") " pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.205015 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.204938 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d49c3330-f379-49e7-92d4-30be09c51dd6-kubelet-config\") pod \"global-pull-secret-syncer-t4t69\" (UID: \"d49c3330-f379-49e7-92d4-30be09c51dd6\") " pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.205015 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.204988 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d49c3330-f379-49e7-92d4-30be09c51dd6-dbus\") pod \"global-pull-secret-syncer-t4t69\" (UID: \"d49c3330-f379-49e7-92d4-30be09c51dd6\") " pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.209279 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.208092 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d49c3330-f379-49e7-92d4-30be09c51dd6-original-pull-secret\") pod \"global-pull-secret-syncer-t4t69\" (UID: \"d49c3330-f379-49e7-92d4-30be09c51dd6\") " pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.318088 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.318036 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t4t69"
Apr 23 08:19:16.433049 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.433027 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t4t69"]
Apr 23 08:19:16.435690 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:19:16.435659 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd49c3330_f379_49e7_92d4_30be09c51dd6.slice/crio-076d98df883b3bf2fe5ebecf31e99e987aaebeea9364b0d71b6eb33199794f3c WatchSource:0}: Error finding container 076d98df883b3bf2fe5ebecf31e99e987aaebeea9364b0d71b6eb33199794f3c: Status 404 returned error can't find the container with id 076d98df883b3bf2fe5ebecf31e99e987aaebeea9364b0d71b6eb33199794f3c
Apr 23 08:19:16.563855 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:16.563820 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t4t69" event={"ID":"d49c3330-f379-49e7-92d4-30be09c51dd6","Type":"ContainerStarted","Data":"076d98df883b3bf2fe5ebecf31e99e987aaebeea9364b0d71b6eb33199794f3c"}
Apr 23 08:19:21.582203 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:21.582167 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t4t69" event={"ID":"d49c3330-f379-49e7-92d4-30be09c51dd6","Type":"ContainerStarted","Data":"7371df302d28b21220354a7cc08020f208c10bf17a36bd5f5429b6033a0d3fda"}
Apr 23 08:19:21.600181 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:21.600137 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-t4t69" podStartSLOduration=2.341144288 podStartE2EDuration="6.600124452s" podCreationTimestamp="2026-04-23 08:19:15 +0000 UTC" firstStartedPulling="2026-04-23 08:19:16.437310358 +0000 UTC m=+278.368673142" lastFinishedPulling="2026-04-23 08:19:20.696290524 +0000 UTC m=+282.627653306" observedRunningTime="2026-04-23 08:19:21.598801473 +0000 UTC m=+283.530164274" watchObservedRunningTime="2026-04-23 08:19:21.600124452 +0000 UTC m=+283.531487252"
Apr 23 08:19:38.543170 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:38.543138 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log"
Apr 23 08:19:38.543771 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:38.543632 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log"
Apr 23 08:19:38.553922 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:38.553903 2561 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 08:19:42.253308 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.253276 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"]
Apr 23 08:19:42.257254 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.257232 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.259846 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.259823 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8drkb\""
Apr 23 08:19:42.259953 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.259844 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 08:19:42.261066 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.261049 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 08:19:42.264375 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.264350 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"]
Apr 23 08:19:42.317060 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.317037 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.317175 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.317088 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vzmw\" (UniqueName: \"kubernetes.io/projected/40565907-63a8-4c4b-b783-ca4df0c39e61-kube-api-access-5vzmw\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.317175 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.317152 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.418312 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.418257 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.418403 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.418354 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzmw\" (UniqueName: \"kubernetes.io/projected/40565907-63a8-4c4b-b783-ca4df0c39e61-kube-api-access-5vzmw\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.418403 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.418391 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.418703 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.418684 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.418739 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.418695 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.426663 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.426636 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzmw\" (UniqueName: \"kubernetes.io/projected/40565907-63a8-4c4b-b783-ca4df0c39e61-kube-api-access-5vzmw\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.568740 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.568661 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:19:42.687758 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.687727 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"]
Apr 23 08:19:42.690666 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:19:42.690638 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40565907_63a8_4c4b_b783_ca4df0c39e61.slice/crio-0b2ae6c38860a318f7345f5aa7e398f4481a8e07e04cacd03ff78aef6078de87 WatchSource:0}: Error finding container 0b2ae6c38860a318f7345f5aa7e398f4481a8e07e04cacd03ff78aef6078de87: Status 404 returned error can't find the container with id 0b2ae6c38860a318f7345f5aa7e398f4481a8e07e04cacd03ff78aef6078de87
Apr 23 08:19:42.692512 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:42.692495 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:19:43.647748 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:43.647715 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974" event={"ID":"40565907-63a8-4c4b-b783-ca4df0c39e61","Type":"ContainerStarted","Data":"0b2ae6c38860a318f7345f5aa7e398f4481a8e07e04cacd03ff78aef6078de87"}
Apr 23 08:19:50.669334 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:50.669298 2561 generic.go:358] "Generic (PLEG): container finished" podID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerID="c7af9c1e1169ade248e2738a9b33f3a4179de7abdae510efc8166557c8b7f9fc" exitCode=0
Apr 23 08:19:50.669334 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:50.669334 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974" event={"ID":"40565907-63a8-4c4b-b783-ca4df0c39e61","Type":"ContainerDied","Data":"c7af9c1e1169ade248e2738a9b33f3a4179de7abdae510efc8166557c8b7f9fc"}
Apr 23 08:19:53.679626 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:53.679589 2561 generic.go:358] "Generic (PLEG): container finished" podID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerID="0f28b20034c98bbda53eb998d5f955f065f94575fc6e894b32d7777685efc209" exitCode=0
Apr 23 08:19:53.679626 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:19:53.679628 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974" event={"ID":"40565907-63a8-4c4b-b783-ca4df0c39e61","Type":"ContainerDied","Data":"0f28b20034c98bbda53eb998d5f955f065f94575fc6e894b32d7777685efc209"}
Apr 23 08:20:00.704019 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:00.703987 2561 generic.go:358] "Generic (PLEG): container finished" podID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerID="91388b26f3d9c969b87a1441151276f8d16c75697f6af9042290c84b57d65c43" exitCode=0
Apr 23 08:20:00.704428 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:00.704057 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974" event={"ID":"40565907-63a8-4c4b-b783-ca4df0c39e61","Type":"ContainerDied","Data":"91388b26f3d9c969b87a1441151276f8d16c75697f6af9042290c84b57d65c43"}
Apr 23 08:20:01.831470 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.831446 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:20:01.889098 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.889065 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-util\") pod \"40565907-63a8-4c4b-b783-ca4df0c39e61\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") "
Apr 23 08:20:01.889314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.889117 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vzmw\" (UniqueName: \"kubernetes.io/projected/40565907-63a8-4c4b-b783-ca4df0c39e61-kube-api-access-5vzmw\") pod \"40565907-63a8-4c4b-b783-ca4df0c39e61\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") "
Apr 23 08:20:01.889314 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.889168 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-bundle\") pod \"40565907-63a8-4c4b-b783-ca4df0c39e61\" (UID: \"40565907-63a8-4c4b-b783-ca4df0c39e61\") "
Apr 23 08:20:01.889973 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.889944 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-bundle" (OuterVolumeSpecName: "bundle") pod "40565907-63a8-4c4b-b783-ca4df0c39e61" (UID: "40565907-63a8-4c4b-b783-ca4df0c39e61"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:20:01.891375 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.891352 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40565907-63a8-4c4b-b783-ca4df0c39e61-kube-api-access-5vzmw" (OuterVolumeSpecName: "kube-api-access-5vzmw") pod "40565907-63a8-4c4b-b783-ca4df0c39e61" (UID: "40565907-63a8-4c4b-b783-ca4df0c39e61"). InnerVolumeSpecName "kube-api-access-5vzmw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:20:01.893488 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.893463 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-util" (OuterVolumeSpecName: "util") pod "40565907-63a8-4c4b-b783-ca4df0c39e61" (UID: "40565907-63a8-4c4b-b783-ca4df0c39e61"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:20:01.989769 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.989688 2561 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-util\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:20:01.989769 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.989718 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5vzmw\" (UniqueName: \"kubernetes.io/projected/40565907-63a8-4c4b-b783-ca4df0c39e61-kube-api-access-5vzmw\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:20:01.989769 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:01.989730 2561 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40565907-63a8-4c4b-b783-ca4df0c39e61-bundle\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:20:02.710955 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:02.710915 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974" event={"ID":"40565907-63a8-4c4b-b783-ca4df0c39e61","Type":"ContainerDied","Data":"0b2ae6c38860a318f7345f5aa7e398f4481a8e07e04cacd03ff78aef6078de87"}
Apr 23 08:20:02.710955 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:02.710960 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b2ae6c38860a318f7345f5aa7e398f4481a8e07e04cacd03ff78aef6078de87"
Apr 23 08:20:02.711160 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:02.710964 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dx5974"
Apr 23 08:20:10.341573 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.341534 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"]
Apr 23 08:20:10.342066 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.342047 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerName="pull"
Apr 23 08:20:10.342145 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.342069 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerName="pull"
Apr 23 08:20:10.342145 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.342092 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerName="util"
Apr 23 08:20:10.342145 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.342101 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerName="util"
Apr 23 08:20:10.342145 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.342118 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerName="extract"
Apr 23 08:20:10.342145 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.342127 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerName="extract"
Apr 23 08:20:10.342409 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.342213 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="40565907-63a8-4c4b-b783-ca4df0c39e61" containerName="extract"
Apr 23 08:20:10.346618 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.346596 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"
Apr 23 08:20:10.350792 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.350772 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 23 08:20:10.350922 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.350905 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-sppzs\""
Apr 23 08:20:10.351310 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.351295 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:20:10.359094 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.359065 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"]
Apr 23 08:20:10.458922 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.458891 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17b4f6a4-06e8-4482-8a2c-37943f35671b-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-7n2hh\" (UID: \"17b4f6a4-06e8-4482-8a2c-37943f35671b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"
Apr 23 08:20:10.458922 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.458924 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plfcm\" (UniqueName: \"kubernetes.io/projected/17b4f6a4-06e8-4482-8a2c-37943f35671b-kube-api-access-plfcm\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-7n2hh\" (UID: \"17b4f6a4-06e8-4482-8a2c-37943f35671b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"
Apr 23 08:20:10.560024 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.559989 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17b4f6a4-06e8-4482-8a2c-37943f35671b-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-7n2hh\" (UID: \"17b4f6a4-06e8-4482-8a2c-37943f35671b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"
Apr 23 08:20:10.560188 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.560030 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plfcm\" (UniqueName: \"kubernetes.io/projected/17b4f6a4-06e8-4482-8a2c-37943f35671b-kube-api-access-plfcm\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-7n2hh\" (UID: \"17b4f6a4-06e8-4482-8a2c-37943f35671b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"
Apr 23 08:20:10.560413 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.560392 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17b4f6a4-06e8-4482-8a2c-37943f35671b-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-7n2hh\" (UID: \"17b4f6a4-06e8-4482-8a2c-37943f35671b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"
Apr 23 08:20:10.570010 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.569990 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plfcm\" (UniqueName: \"kubernetes.io/projected/17b4f6a4-06e8-4482-8a2c-37943f35671b-kube-api-access-plfcm\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-7n2hh\" (UID: \"17b4f6a4-06e8-4482-8a2c-37943f35671b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"
Apr 23 08:20:10.655755 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.655692 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"
Apr 23 08:20:10.775069 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:10.775046 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh"]
Apr 23 08:20:10.777322 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:20:10.777295 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17b4f6a4_06e8_4482_8a2c_37943f35671b.slice/crio-3da97c7763598543c2461121a5f4e6a45a349034ca375cdc3e6f8c585ea3b800 WatchSource:0}: Error finding container 3da97c7763598543c2461121a5f4e6a45a349034ca375cdc3e6f8c585ea3b800: Status 404 returned error can't find the container with id 3da97c7763598543c2461121a5f4e6a45a349034ca375cdc3e6f8c585ea3b800
Apr 23 08:20:11.740944 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:11.740896 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh" event={"ID":"17b4f6a4-06e8-4482-8a2c-37943f35671b","Type":"ContainerStarted","Data":"3da97c7763598543c2461121a5f4e6a45a349034ca375cdc3e6f8c585ea3b800"}
Apr 23 08:20:12.746256 ip-10-0-134-8
kubenswrapper[2561]: I0423 08:20:12.746224 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh" event={"ID":"17b4f6a4-06e8-4482-8a2c-37943f35671b","Type":"ContainerStarted","Data":"49f730c0b24fa8b3e138c57434a813f9abd7f8f6b7fcfd6da8fc0d64f46c40ba"} Apr 23 08:20:12.770779 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:12.770676 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-7n2hh" podStartSLOduration=1.255008883 podStartE2EDuration="2.770659863s" podCreationTimestamp="2026-04-23 08:20:10 +0000 UTC" firstStartedPulling="2026-04-23 08:20:10.779653015 +0000 UTC m=+332.711015798" lastFinishedPulling="2026-04-23 08:20:12.295303984 +0000 UTC m=+334.226666778" observedRunningTime="2026-04-23 08:20:12.768594477 +0000 UTC m=+334.699957289" watchObservedRunningTime="2026-04-23 08:20:12.770659863 +0000 UTC m=+334.702022664" Apr 23 08:20:14.919401 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:14.919371 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-xhkgr"] Apr 23 08:20:14.922763 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:14.922747 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:14.925502 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:14.925484 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-bkf5j\"" Apr 23 08:20:14.926775 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:14.926758 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 08:20:14.926775 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:14.926766 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 08:20:14.934448 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:14.934426 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-xhkgr"] Apr 23 08:20:15.000075 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:15.000048 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkvv2\" (UniqueName: \"kubernetes.io/projected/156f5ba2-51c1-4858-bd11-e5fe9aae1cb8-kube-api-access-gkvv2\") pod \"cert-manager-webhook-587ccfb98-xhkgr\" (UID: \"156f5ba2-51c1-4858-bd11-e5fe9aae1cb8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:15.000190 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:15.000090 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/156f5ba2-51c1-4858-bd11-e5fe9aae1cb8-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-xhkgr\" (UID: \"156f5ba2-51c1-4858-bd11-e5fe9aae1cb8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:15.100892 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:15.100852 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkvv2\" (UniqueName: 
\"kubernetes.io/projected/156f5ba2-51c1-4858-bd11-e5fe9aae1cb8-kube-api-access-gkvv2\") pod \"cert-manager-webhook-587ccfb98-xhkgr\" (UID: \"156f5ba2-51c1-4858-bd11-e5fe9aae1cb8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:15.101009 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:15.100898 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/156f5ba2-51c1-4858-bd11-e5fe9aae1cb8-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-xhkgr\" (UID: \"156f5ba2-51c1-4858-bd11-e5fe9aae1cb8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:15.108909 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:15.108885 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/156f5ba2-51c1-4858-bd11-e5fe9aae1cb8-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-xhkgr\" (UID: \"156f5ba2-51c1-4858-bd11-e5fe9aae1cb8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:15.109171 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:15.109152 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkvv2\" (UniqueName: \"kubernetes.io/projected/156f5ba2-51c1-4858-bd11-e5fe9aae1cb8-kube-api-access-gkvv2\") pod \"cert-manager-webhook-587ccfb98-xhkgr\" (UID: \"156f5ba2-51c1-4858-bd11-e5fe9aae1cb8\") " pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:15.248844 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:15.248817 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:15.366250 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:15.366227 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-xhkgr"] Apr 23 08:20:15.368329 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:20:15.368302 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod156f5ba2_51c1_4858_bd11_e5fe9aae1cb8.slice/crio-fbb0f89879598e6b3cb71d12829a4dcc29bb9cdd1f5af9368d91cf54d82d0a23 WatchSource:0}: Error finding container fbb0f89879598e6b3cb71d12829a4dcc29bb9cdd1f5af9368d91cf54d82d0a23: Status 404 returned error can't find the container with id fbb0f89879598e6b3cb71d12829a4dcc29bb9cdd1f5af9368d91cf54d82d0a23 Apr 23 08:20:15.762821 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:15.762784 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" event={"ID":"156f5ba2-51c1-4858-bd11-e5fe9aae1cb8","Type":"ContainerStarted","Data":"fbb0f89879598e6b3cb71d12829a4dcc29bb9cdd1f5af9368d91cf54d82d0a23"} Apr 23 08:20:18.478747 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.478716 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-62mbp"] Apr 23 08:20:18.482038 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.482021 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" Apr 23 08:20:18.484510 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.484489 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-lg7ls\"" Apr 23 08:20:18.488040 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.488016 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-62mbp"] Apr 23 08:20:18.630827 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.630792 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26929d2c-c6ec-4e9e-95e6-a2dda5985e3b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-62mbp\" (UID: \"26929d2c-c6ec-4e9e-95e6-a2dda5985e3b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" Apr 23 08:20:18.630989 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.630833 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cv6n\" (UniqueName: \"kubernetes.io/projected/26929d2c-c6ec-4e9e-95e6-a2dda5985e3b-kube-api-access-7cv6n\") pod \"cert-manager-cainjector-68b757865b-62mbp\" (UID: \"26929d2c-c6ec-4e9e-95e6-a2dda5985e3b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" Apr 23 08:20:18.732096 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.732004 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26929d2c-c6ec-4e9e-95e6-a2dda5985e3b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-62mbp\" (UID: \"26929d2c-c6ec-4e9e-95e6-a2dda5985e3b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" Apr 23 08:20:18.732096 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.732052 2561 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-7cv6n\" (UniqueName: \"kubernetes.io/projected/26929d2c-c6ec-4e9e-95e6-a2dda5985e3b-kube-api-access-7cv6n\") pod \"cert-manager-cainjector-68b757865b-62mbp\" (UID: \"26929d2c-c6ec-4e9e-95e6-a2dda5985e3b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" Apr 23 08:20:18.742018 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.741986 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26929d2c-c6ec-4e9e-95e6-a2dda5985e3b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-62mbp\" (UID: \"26929d2c-c6ec-4e9e-95e6-a2dda5985e3b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" Apr 23 08:20:18.742158 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.742140 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cv6n\" (UniqueName: \"kubernetes.io/projected/26929d2c-c6ec-4e9e-95e6-a2dda5985e3b-kube-api-access-7cv6n\") pod \"cert-manager-cainjector-68b757865b-62mbp\" (UID: \"26929d2c-c6ec-4e9e-95e6-a2dda5985e3b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" Apr 23 08:20:18.774228 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.774196 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" event={"ID":"156f5ba2-51c1-4858-bd11-e5fe9aae1cb8","Type":"ContainerStarted","Data":"9812491d5a7e7eefd5a70054fdf7ab9bd505545252d49fa93885eb49352e9f64"} Apr 23 08:20:18.774349 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.774326 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:18.791056 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.791018 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" podStartSLOduration=2.345060953 
podStartE2EDuration="4.791006462s" podCreationTimestamp="2026-04-23 08:20:14 +0000 UTC" firstStartedPulling="2026-04-23 08:20:15.370178496 +0000 UTC m=+337.301541278" lastFinishedPulling="2026-04-23 08:20:17.816124008 +0000 UTC m=+339.747486787" observedRunningTime="2026-04-23 08:20:18.790382186 +0000 UTC m=+340.721744999" watchObservedRunningTime="2026-04-23 08:20:18.791006462 +0000 UTC m=+340.722369259" Apr 23 08:20:18.791703 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.791684 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" Apr 23 08:20:18.930301 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:18.930257 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-62mbp"] Apr 23 08:20:18.930860 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:20:18.930835 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26929d2c_c6ec_4e9e_95e6_a2dda5985e3b.slice/crio-d2c90a83087b9d146559d70e1b27e3f72e468ca088485ad17e70c086bd1a1474 WatchSource:0}: Error finding container d2c90a83087b9d146559d70e1b27e3f72e468ca088485ad17e70c086bd1a1474: Status 404 returned error can't find the container with id d2c90a83087b9d146559d70e1b27e3f72e468ca088485ad17e70c086bd1a1474 Apr 23 08:20:19.778300 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:19.778248 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" event={"ID":"26929d2c-c6ec-4e9e-95e6-a2dda5985e3b","Type":"ContainerStarted","Data":"63fda214252af527cbb9a9d358eee0d38495dce62fa55fa047d7976b6fafe86a"} Apr 23 08:20:19.778300 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:19.778303 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" 
event={"ID":"26929d2c-c6ec-4e9e-95e6-a2dda5985e3b","Type":"ContainerStarted","Data":"d2c90a83087b9d146559d70e1b27e3f72e468ca088485ad17e70c086bd1a1474"} Apr 23 08:20:19.795130 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:19.795085 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-62mbp" podStartSLOduration=1.795071932 podStartE2EDuration="1.795071932s" podCreationTimestamp="2026-04-23 08:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:20:19.794090101 +0000 UTC m=+341.725452926" watchObservedRunningTime="2026-04-23 08:20:19.795071932 +0000 UTC m=+341.726434734" Apr 23 08:20:24.780791 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:24.780701 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-xhkgr" Apr 23 08:20:30.082788 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.082760 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp"] Apr 23 08:20:30.105103 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.105076 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp"] Apr 23 08:20:30.105256 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.105184 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.108067 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.108045 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 08:20:30.109276 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.109242 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8drkb\"" Apr 23 08:20:30.109383 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.109243 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 08:20:30.119238 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.119211 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd52g\" (UniqueName: \"kubernetes.io/projected/96df3282-e881-4237-9de5-7fd93364188e-kube-api-access-wd52g\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.119343 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.119257 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.119343 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.119315 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.219715 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.219668 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.219829 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.219776 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd52g\" (UniqueName: \"kubernetes.io/projected/96df3282-e881-4237-9de5-7fd93364188e-kube-api-access-wd52g\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.219829 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.219809 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.220104 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.220082 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-util\") pod 
\"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.220104 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.220096 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.229706 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.229680 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd52g\" (UniqueName: \"kubernetes.io/projected/96df3282-e881-4237-9de5-7fd93364188e-kube-api-access-wd52g\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.415115 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.415040 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:30.533542 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.533515 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp"] Apr 23 08:20:30.535655 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:20:30.535617 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96df3282_e881_4237_9de5_7fd93364188e.slice/crio-29fb583be91188f73dd151493c99f1950f04eeca8f4178d3294a6a0bc36b82fa WatchSource:0}: Error finding container 29fb583be91188f73dd151493c99f1950f04eeca8f4178d3294a6a0bc36b82fa: Status 404 returned error can't find the container with id 29fb583be91188f73dd151493c99f1950f04eeca8f4178d3294a6a0bc36b82fa Apr 23 08:20:30.814804 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.814772 2561 generic.go:358] "Generic (PLEG): container finished" podID="96df3282-e881-4237-9de5-7fd93364188e" containerID="b0e3c266a8ed6217fa9aed5a08b32de64a8918211ae34ac2484b39331301fde2" exitCode=0 Apr 23 08:20:30.814925 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.814853 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" event={"ID":"96df3282-e881-4237-9de5-7fd93364188e","Type":"ContainerDied","Data":"b0e3c266a8ed6217fa9aed5a08b32de64a8918211ae34ac2484b39331301fde2"} Apr 23 08:20:30.814925 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:30.814884 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" event={"ID":"96df3282-e881-4237-9de5-7fd93364188e","Type":"ContainerStarted","Data":"29fb583be91188f73dd151493c99f1950f04eeca8f4178d3294a6a0bc36b82fa"} Apr 23 08:20:39.846872 ip-10-0-134-8 kubenswrapper[2561]: I0423 
08:20:39.846845 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" event={"ID":"96df3282-e881-4237-9de5-7fd93364188e","Type":"ContainerStarted","Data":"592e0f035cc82c40ad444781bc92c7ae977ec9877e0043aaf65fdc1a4e5264da"} Apr 23 08:20:40.851338 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:40.851300 2561 generic.go:358] "Generic (PLEG): container finished" podID="96df3282-e881-4237-9de5-7fd93364188e" containerID="592e0f035cc82c40ad444781bc92c7ae977ec9877e0043aaf65fdc1a4e5264da" exitCode=0 Apr 23 08:20:40.851735 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:40.851391 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" event={"ID":"96df3282-e881-4237-9de5-7fd93364188e","Type":"ContainerDied","Data":"592e0f035cc82c40ad444781bc92c7ae977ec9877e0043aaf65fdc1a4e5264da"} Apr 23 08:20:41.856830 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:41.856797 2561 generic.go:358] "Generic (PLEG): container finished" podID="96df3282-e881-4237-9de5-7fd93364188e" containerID="2b455c2330855ad16bea7d92746df2aeff0c10e7018c974d113487a50fa4d404" exitCode=0 Apr 23 08:20:41.857219 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:41.856902 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" event={"ID":"96df3282-e881-4237-9de5-7fd93364188e","Type":"ContainerDied","Data":"2b455c2330855ad16bea7d92746df2aeff0c10e7018c974d113487a50fa4d404"} Apr 23 08:20:42.984520 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:42.984498 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" Apr 23 08:20:43.012565 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.012544 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-bundle\") pod \"96df3282-e881-4237-9de5-7fd93364188e\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " Apr 23 08:20:43.012675 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.012587 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd52g\" (UniqueName: \"kubernetes.io/projected/96df3282-e881-4237-9de5-7fd93364188e-kube-api-access-wd52g\") pod \"96df3282-e881-4237-9de5-7fd93364188e\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " Apr 23 08:20:43.012675 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.012640 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-util\") pod \"96df3282-e881-4237-9de5-7fd93364188e\" (UID: \"96df3282-e881-4237-9de5-7fd93364188e\") " Apr 23 08:20:43.012979 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.012948 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-bundle" (OuterVolumeSpecName: "bundle") pod "96df3282-e881-4237-9de5-7fd93364188e" (UID: "96df3282-e881-4237-9de5-7fd93364188e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:20:43.014587 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.014565 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96df3282-e881-4237-9de5-7fd93364188e-kube-api-access-wd52g" (OuterVolumeSpecName: "kube-api-access-wd52g") pod "96df3282-e881-4237-9de5-7fd93364188e" (UID: "96df3282-e881-4237-9de5-7fd93364188e"). InnerVolumeSpecName "kube-api-access-wd52g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:20:43.020056 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.020030 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-util" (OuterVolumeSpecName: "util") pod "96df3282-e881-4237-9de5-7fd93364188e" (UID: "96df3282-e881-4237-9de5-7fd93364188e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:20:43.113417 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.113344 2561 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-util\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:20:43.113417 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.113369 2561 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96df3282-e881-4237-9de5-7fd93364188e-bundle\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:20:43.113417 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.113382 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wd52g\" (UniqueName: \"kubernetes.io/projected/96df3282-e881-4237-9de5-7fd93364188e-kube-api-access-wd52g\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:20:43.865306 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.865239 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp" event={"ID":"96df3282-e881-4237-9de5-7fd93364188e","Type":"ContainerDied","Data":"29fb583be91188f73dd151493c99f1950f04eeca8f4178d3294a6a0bc36b82fa"}
Apr 23 08:20:43.865306 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.865298 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evm7dp"
Apr 23 08:20:43.865521 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:43.865302 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29fb583be91188f73dd151493c99f1950f04eeca8f4178d3294a6a0bc36b82fa"
Apr 23 08:20:59.174686 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.174653 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"]
Apr 23 08:20:59.175077 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.175000 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96df3282-e881-4237-9de5-7fd93364188e" containerName="pull"
Apr 23 08:20:59.175077 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.175010 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="96df3282-e881-4237-9de5-7fd93364188e" containerName="pull"
Apr 23 08:20:59.175077 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.175020 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96df3282-e881-4237-9de5-7fd93364188e" containerName="util"
Apr 23 08:20:59.175077 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.175024 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="96df3282-e881-4237-9de5-7fd93364188e" containerName="util"
Apr 23 08:20:59.175077 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.175032 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96df3282-e881-4237-9de5-7fd93364188e" containerName="extract"
Apr 23 08:20:59.175077 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.175037 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="96df3282-e881-4237-9de5-7fd93364188e" containerName="extract"
Apr 23 08:20:59.175303 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.175089 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="96df3282-e881-4237-9de5-7fd93364188e" containerName="extract"
Apr 23 08:20:59.178039 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.178023 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.181868 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.181833 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\""
Apr 23 08:20:59.181868 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.181840 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-j4jdr\""
Apr 23 08:20:59.182064 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.181841 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\""
Apr 23 08:20:59.182064 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.181891 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:20:59.182064 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.181888 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\""
Apr 23 08:20:59.182064 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.181885 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 23 08:20:59.188121 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.188099 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"]
Apr 23 08:20:59.241246 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.241203 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37b79d11-c7a5-4914-8fc3-ffd0818da189-metrics-certs\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.241386 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.241286 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/37b79d11-c7a5-4914-8fc3-ffd0818da189-manager-config\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.241386 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.241359 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfqk2\" (UniqueName: \"kubernetes.io/projected/37b79d11-c7a5-4914-8fc3-ffd0818da189-kube-api-access-nfqk2\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.241483 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.241400 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37b79d11-c7a5-4914-8fc3-ffd0818da189-cert\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.342050 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.342027 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37b79d11-c7a5-4914-8fc3-ffd0818da189-metrics-certs\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.342187 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.342075 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/37b79d11-c7a5-4914-8fc3-ffd0818da189-manager-config\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.342187 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.342099 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfqk2\" (UniqueName: \"kubernetes.io/projected/37b79d11-c7a5-4914-8fc3-ffd0818da189-kube-api-access-nfqk2\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.342187 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.342127 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37b79d11-c7a5-4914-8fc3-ffd0818da189-cert\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.342668 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.342636 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/37b79d11-c7a5-4914-8fc3-ffd0818da189-manager-config\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.344500 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.344480 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37b79d11-c7a5-4914-8fc3-ffd0818da189-metrics-certs\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.344598 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.344545 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37b79d11-c7a5-4914-8fc3-ffd0818da189-cert\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.355213 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.355190 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfqk2\" (UniqueName: \"kubernetes.io/projected/37b79d11-c7a5-4914-8fc3-ffd0818da189-kube-api-access-nfqk2\") pod \"jobset-controller-manager-7759cfb7f9-khn4t\" (UID: \"37b79d11-c7a5-4914-8fc3-ffd0818da189\") " pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.488207 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.488176 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:20:59.620472 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.620446 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"]
Apr 23 08:20:59.622645 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:20:59.622622 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37b79d11_c7a5_4914_8fc3_ffd0818da189.slice/crio-3b96f34a8b5b33dd6a8272b02dad21aad4f6adc1a99cb4ed5b1fec36cc678963 WatchSource:0}: Error finding container 3b96f34a8b5b33dd6a8272b02dad21aad4f6adc1a99cb4ed5b1fec36cc678963: Status 404 returned error can't find the container with id 3b96f34a8b5b33dd6a8272b02dad21aad4f6adc1a99cb4ed5b1fec36cc678963
Apr 23 08:20:59.923700 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:20:59.923616 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t" event={"ID":"37b79d11-c7a5-4914-8fc3-ffd0818da189","Type":"ContainerStarted","Data":"3b96f34a8b5b33dd6a8272b02dad21aad4f6adc1a99cb4ed5b1fec36cc678963"}
Apr 23 08:21:02.936698 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:21:02.936658 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t" event={"ID":"37b79d11-c7a5-4914-8fc3-ffd0818da189","Type":"ContainerStarted","Data":"6ab95ac1eade454e143d0917db94451b1555a73e22653fb3a0e2f61bc7daf1cb"}
Apr 23 08:21:02.937129 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:21:02.936784 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:21:02.956789 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:21:02.956738 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t" podStartSLOduration=1.182842534 podStartE2EDuration="3.95672445s" podCreationTimestamp="2026-04-23 08:20:59 +0000 UTC" firstStartedPulling="2026-04-23 08:20:59.624320278 +0000 UTC m=+381.555683060" lastFinishedPulling="2026-04-23 08:21:02.398202183 +0000 UTC m=+384.329564976" observedRunningTime="2026-04-23 08:21:02.954106059 +0000 UTC m=+384.885468871" watchObservedRunningTime="2026-04-23 08:21:02.95672445 +0000 UTC m=+384.888087251"
Apr 23 08:21:13.945230 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:21:13.945199 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-7759cfb7f9-khn4t"
Apr 23 08:23:04.375103 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.375072 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d75f864d5-2lhtk"]
Apr 23 08:23:04.378475 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.378455 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.388674 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.388646 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d75f864d5-2lhtk"]
Apr 23 08:23:04.465286 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.465230 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-console-config\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.465448 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.465359 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-service-ca\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.465448 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.465416 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acac8d88-445f-4195-bd24-42a58556c615-console-oauth-config\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.465448 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.465437 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-trusted-ca-bundle\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.465553 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.465452 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-oauth-serving-cert\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.465553 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.465478 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhwl6\" (UniqueName: \"kubernetes.io/projected/acac8d88-445f-4195-bd24-42a58556c615-kube-api-access-jhwl6\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.465614 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.465558 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acac8d88-445f-4195-bd24-42a58556c615-console-serving-cert\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.566421 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.566382 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-service-ca\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.566607 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.566434 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acac8d88-445f-4195-bd24-42a58556c615-console-oauth-config\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.566607 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.566478 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-trusted-ca-bundle\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.566607 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.566501 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-oauth-serving-cert\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.566607 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.566535 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhwl6\" (UniqueName: \"kubernetes.io/projected/acac8d88-445f-4195-bd24-42a58556c615-kube-api-access-jhwl6\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.566815 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.566658 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acac8d88-445f-4195-bd24-42a58556c615-console-serving-cert\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.566815 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.566712 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-console-config\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.567338 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.567303 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-service-ca\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.567466 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.567354 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-oauth-serving-cert\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.567532 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.567506 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-console-config\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.567595 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.567566 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acac8d88-445f-4195-bd24-42a58556c615-trusted-ca-bundle\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.568916 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.568895 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acac8d88-445f-4195-bd24-42a58556c615-console-oauth-config\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.569022 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.569004 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acac8d88-445f-4195-bd24-42a58556c615-console-serving-cert\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.576207 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.576189 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhwl6\" (UniqueName: \"kubernetes.io/projected/acac8d88-445f-4195-bd24-42a58556c615-kube-api-access-jhwl6\") pod \"console-5d75f864d5-2lhtk\" (UID: \"acac8d88-445f-4195-bd24-42a58556c615\") " pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.689693 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.689607 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:04.808945 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:04.808924 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d75f864d5-2lhtk"]
Apr 23 08:23:04.810951 ip-10-0-134-8 kubenswrapper[2561]: W0423 08:23:04.810921 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacac8d88_445f_4195_bd24_42a58556c615.slice/crio-859a5a4305b74e5604976a448dc8026da2922c9ec59ef744dcc446e56b3c951e WatchSource:0}: Error finding container 859a5a4305b74e5604976a448dc8026da2922c9ec59ef744dcc446e56b3c951e: Status 404 returned error can't find the container with id 859a5a4305b74e5604976a448dc8026da2922c9ec59ef744dcc446e56b3c951e
Apr 23 08:23:05.361596 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:05.361559 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d75f864d5-2lhtk" event={"ID":"acac8d88-445f-4195-bd24-42a58556c615","Type":"ContainerStarted","Data":"bc452858e325b91cb7d8c5f15d5260a4a87bb1abd9da98d1bed34d30a609eb0e"}
Apr 23 08:23:05.361596 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:05.361599 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d75f864d5-2lhtk" event={"ID":"acac8d88-445f-4195-bd24-42a58556c615","Type":"ContainerStarted","Data":"859a5a4305b74e5604976a448dc8026da2922c9ec59ef744dcc446e56b3c951e"}
Apr 23 08:23:05.380170 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:05.380130 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d75f864d5-2lhtk" podStartSLOduration=1.380116131 podStartE2EDuration="1.380116131s" podCreationTimestamp="2026-04-23 08:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:23:05.378972747 +0000 UTC m=+507.310335547" watchObservedRunningTime="2026-04-23 08:23:05.380116131 +0000 UTC m=+507.311478932"
Apr 23 08:23:14.689806 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:14.689767 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:14.689806 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:14.689804 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:14.694218 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:14.694197 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:15.401703 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:15.401669 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d75f864d5-2lhtk"
Apr 23 08:23:15.447736 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:15.447706 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cf9f45756-jcjns"]
Apr 23 08:23:40.467438 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.467333 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7cf9f45756-jcjns" podUID="ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" containerName="console" containerID="cri-o://a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb" gracePeriod=15
Apr 23 08:23:40.703165 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.703145 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cf9f45756-jcjns_ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd/console/0.log"
Apr 23 08:23:40.703288 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.703203 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:23:40.789686 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.789623 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-config\") pod \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") "
Apr 23 08:23:40.789686 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.789675 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-oauth-serving-cert\") pod \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") "
Apr 23 08:23:40.789854 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.789704 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-serving-cert\") pod \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") "
Apr 23 08:23:40.789854 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.789727 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-service-ca\") pod \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") "
Apr 23 08:23:40.789854 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.789770 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwftg\" (UniqueName: \"kubernetes.io/projected/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-kube-api-access-xwftg\") pod \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") "
Apr 23 08:23:40.790011 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.789843 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-trusted-ca-bundle\") pod \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") "
Apr 23 08:23:40.790011 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.789932 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-oauth-config\") pod \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\" (UID: \"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd\") "
Apr 23 08:23:40.790102 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.790074 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-config" (OuterVolumeSpecName: "console-config") pod "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" (UID: "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:23:40.790228 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.790198 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-config\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:23:40.790228 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.790212 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-service-ca" (OuterVolumeSpecName: "service-ca") pod "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" (UID: "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:23:40.790228 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.790220 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" (UID: "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:23:40.790429 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.790306 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" (UID: "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:23:40.791971 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.791933 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" (UID: "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:23:40.792052 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.791970 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-kube-api-access-xwftg" (OuterVolumeSpecName: "kube-api-access-xwftg") pod "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" (UID: "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd"). InnerVolumeSpecName "kube-api-access-xwftg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:23:40.792052 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.792022 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" (UID: "ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:23:40.891174 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.891136 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-serving-cert\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:23:40.891174 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.891171 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-service-ca\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:23:40.891347 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.891186 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwftg\" (UniqueName: \"kubernetes.io/projected/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-kube-api-access-xwftg\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:23:40.891347 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.891199 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-trusted-ca-bundle\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:23:40.891347 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.891213 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-console-oauth-config\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:23:40.891347 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:40.891225 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd-oauth-serving-cert\") on node \"ip-10-0-134-8.ec2.internal\" DevicePath \"\""
Apr 23 08:23:41.494468 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.494441 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cf9f45756-jcjns_ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd/console/0.log"
Apr 23 08:23:41.494877 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.494481 2561 generic.go:358] "Generic (PLEG): container finished" podID="ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" containerID="a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb" exitCode=2
Apr 23 08:23:41.494877 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.494544 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf9f45756-jcjns" event={"ID":"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd","Type":"ContainerDied","Data":"a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb"}
Apr 23 08:23:41.494877 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.494564 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf9f45756-jcjns" event={"ID":"ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd","Type":"ContainerDied","Data":"c20b9f40f80b779fe021ea89d5f1fc210ab790f9875f75a53a02ef2b839690e4"}
Apr 23 08:23:41.494877 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.494578 2561 scope.go:117] "RemoveContainer" containerID="a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb"
Apr 23 08:23:41.494877 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.494542 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf9f45756-jcjns"
Apr 23 08:23:41.503109 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.503092 2561 scope.go:117] "RemoveContainer" containerID="a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb"
Apr 23 08:23:41.503376 ip-10-0-134-8 kubenswrapper[2561]: E0423 08:23:41.503355 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb\": container with ID starting with a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb not found: ID does not exist" containerID="a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb"
Apr 23 08:23:41.503438 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.503385 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb"} err="failed to get container status \"a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb\": rpc error: code = NotFound desc = could not find container \"a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb\": container with ID starting with a0768a1c562937e4a8bd7d55c755948a4aa24d94391c802c7ed0ed7bc1c0d0eb not found: ID does not exist"
Apr 23 08:23:41.515533 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.515510 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cf9f45756-jcjns"]
Apr 23 08:23:41.521010 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:41.521010 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7cf9f45756-jcjns"]
Apr 23 08:23:42.664463 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:23:42.664431 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" path="/var/lib/kubelet/pods/ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd/volumes"
Apr 23 08:24:38.572404 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:24:38.572377 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:24:38.572886 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:24:38.572377 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:29:38.599659 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:29:38.599576 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:29:38.601027 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:29:38.601005 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:34:38.631568 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:34:38.631535 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:34:38.634189 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:34:38.632869 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:39:38.658355 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:39:38.658237 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:39:38.662331 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:39:38.660179 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:44:38.689995 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:44:38.689874 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:44:38.695837 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:44:38.694399 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:49:38.714608 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:49:38.714583 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:49:38.720572 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:49:38.720550 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:54:38.740033 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:54:38.739935 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:54:38.745936 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:54:38.745912 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 08:59:38.765419 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:59:38.765307 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 
23 08:59:38.772286 ip-10-0-134-8 kubenswrapper[2561]: I0423 08:59:38.772249 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 09:03:10.551373 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:10.551341 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-t4t69_d49c3330-f379-49e7-92d4-30be09c51dd6/global-pull-secret-syncer/0.log" Apr 23 09:03:10.615826 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:10.615791 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mht6n_6b438d86-63e2-4e66-a166-475de69c7900/konnectivity-agent/0.log" Apr 23 09:03:10.807007 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:10.806875 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-8.ec2.internal_4a0c7941d424e31aadfd07308d6e5c7b/haproxy/0.log" Apr 23 09:03:14.143564 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.143524 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e7d4824-41fb-4a7a-959b-dc8a61e71187/alertmanager/0.log" Apr 23 09:03:14.190380 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.190349 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e7d4824-41fb-4a7a-959b-dc8a61e71187/config-reloader/0.log" Apr 23 09:03:14.236528 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.236498 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e7d4824-41fb-4a7a-959b-dc8a61e71187/kube-rbac-proxy-web/0.log" Apr 23 09:03:14.299199 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.299172 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e7d4824-41fb-4a7a-959b-dc8a61e71187/kube-rbac-proxy/0.log" Apr 
23 09:03:14.359239 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.359213 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e7d4824-41fb-4a7a-959b-dc8a61e71187/kube-rbac-proxy-metric/0.log" Apr 23 09:03:14.401043 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.400973 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e7d4824-41fb-4a7a-959b-dc8a61e71187/prom-label-proxy/0.log" Apr 23 09:03:14.442979 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.442956 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e7d4824-41fb-4a7a-959b-dc8a61e71187/init-config-reloader/0.log" Apr 23 09:03:14.517368 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.517344 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-cbh4v_2c1dd227-3279-4f30-b918-473a6a080619/cluster-monitoring-operator/0.log" Apr 23 09:03:14.834983 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.834958 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nl62j_5b7a07bf-2318-478f-9149-ee5a0395ef3f/node-exporter/0.log" Apr 23 09:03:14.861441 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.861414 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nl62j_5b7a07bf-2318-478f-9149-ee5a0395ef3f/kube-rbac-proxy/0.log" Apr 23 09:03:14.889938 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:14.889918 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nl62j_5b7a07bf-2318-478f-9149-ee5a0395ef3f/init-textfile/0.log" Apr 23 09:03:15.550023 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.549947 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-ksft5_26e34e7b-74c5-44ab-a606-d279a8dc3619/prometheus-operator-admission-webhook/0.log" Apr 23 09:03:15.586729 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.586700 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-84c6475f8d-fqb5d_4db818a7-277d-4827-bfd7-b70afd3dbbe4/telemeter-client/0.log" Apr 23 09:03:15.611320 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.611290 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-84c6475f8d-fqb5d_4db818a7-277d-4827-bfd7-b70afd3dbbe4/reload/0.log" Apr 23 09:03:15.635485 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.635462 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-84c6475f8d-fqb5d_4db818a7-277d-4827-bfd7-b70afd3dbbe4/kube-rbac-proxy/0.log" Apr 23 09:03:15.670743 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.670718 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c655cf7d5-k7g7j_5effe1b4-68bd-4d42-b808-4141bd7e5df5/thanos-query/0.log" Apr 23 09:03:15.697144 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.697116 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c655cf7d5-k7g7j_5effe1b4-68bd-4d42-b808-4141bd7e5df5/kube-rbac-proxy-web/0.log" Apr 23 09:03:15.724280 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.724246 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c655cf7d5-k7g7j_5effe1b4-68bd-4d42-b808-4141bd7e5df5/kube-rbac-proxy/0.log" Apr 23 09:03:15.751503 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.751471 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c655cf7d5-k7g7j_5effe1b4-68bd-4d42-b808-4141bd7e5df5/prom-label-proxy/0.log" Apr 23 09:03:15.784494 
ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.784462 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c655cf7d5-k7g7j_5effe1b4-68bd-4d42-b808-4141bd7e5df5/kube-rbac-proxy-rules/0.log" Apr 23 09:03:15.813243 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:15.813167 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c655cf7d5-k7g7j_5effe1b4-68bd-4d42-b808-4141bd7e5df5/kube-rbac-proxy-metrics/0.log" Apr 23 09:03:17.194149 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.194122 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/1.log" Apr 23 09:03:17.201389 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.201355 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94fwt_d460c450-63cc-49ec-af6a-6618277ea5cf/console-operator/2.log" Apr 23 09:03:17.567126 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.567070 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d75f864d5-2lhtk_acac8d88-445f-4195-bd24-42a58556c615/console/0.log" Apr 23 09:03:17.804183 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.804152 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8"] Apr 23 09:03:17.804548 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.804536 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" containerName="console" Apr 23 09:03:17.804592 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.804550 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" containerName="console" Apr 23 09:03:17.804630 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.804609 2561 
memory_manager.go:356] "RemoveStaleState removing state" podUID="ef3b1a8d-f3a4-42e7-bda5-a50593cac0fd" containerName="console" Apr 23 09:03:17.807669 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.807653 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:17.810716 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.810696 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjn8\"/\"openshift-service-ca.crt\"" Apr 23 09:03:17.810792 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.810754 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjn8\"/\"kube-root-ca.crt\"" Apr 23 09:03:17.811954 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.811938 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmjn8\"/\"default-dockercfg-pjq8w\"" Apr 23 09:03:17.817767 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.817722 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8"] Apr 23 09:03:17.952195 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.952156 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-sys\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:17.952387 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.952213 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-proc\") pod \"perf-node-gather-daemonset-wctb8\" (UID: 
\"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:17.952387 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.952238 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-lib-modules\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:17.952387 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.952281 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-podres\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:17.952387 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:17.952311 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttkmn\" (UniqueName: \"kubernetes.io/projected/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-kube-api-access-ttkmn\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.043357 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.043329 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-jmnkj_e488b11b-324c-4989-b28f-8aaa6ecd0cab/volume-data-source-validator/0.log" Apr 23 09:03:18.052946 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.052912 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-proc\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.052946 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.052942 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-lib-modules\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.053139 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.052961 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-podres\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.053139 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.053020 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttkmn\" (UniqueName: \"kubernetes.io/projected/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-kube-api-access-ttkmn\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.053139 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.053075 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-lib-modules\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.053139 ip-10-0-134-8 kubenswrapper[2561]: I0423 
09:03:18.053028 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-proc\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.053139 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.053108 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-podres\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.053139 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.053138 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-sys\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.053452 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.053163 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-sys\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.061119 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.061090 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttkmn\" (UniqueName: \"kubernetes.io/projected/0e9b84dd-0b1d-4310-b5a9-7589f58332b7-kube-api-access-ttkmn\") pod \"perf-node-gather-daemonset-wctb8\" (UID: \"0e9b84dd-0b1d-4310-b5a9-7589f58332b7\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" 
Apr 23 09:03:18.118172 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.118089 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.241138 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.241110 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8"] Apr 23 09:03:18.243472 ip-10-0-134-8 kubenswrapper[2561]: W0423 09:03:18.243445 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e9b84dd_0b1d_4310_b5a9_7589f58332b7.slice/crio-288ace7ca584a62c2ab8e66d00bac957d60edf6dda9e32e08c224f85b2035387 WatchSource:0}: Error finding container 288ace7ca584a62c2ab8e66d00bac957d60edf6dda9e32e08c224f85b2035387: Status 404 returned error can't find the container with id 288ace7ca584a62c2ab8e66d00bac957d60edf6dda9e32e08c224f85b2035387 Apr 23 09:03:18.245109 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.245085 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:03:18.533856 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.533819 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" event={"ID":"0e9b84dd-0b1d-4310-b5a9-7589f58332b7","Type":"ContainerStarted","Data":"0b23c36b152c021953ac3f11ea801477f26eb939bc35a4d3020b17a3b31a497d"} Apr 23 09:03:18.533856 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.533856 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" event={"ID":"0e9b84dd-0b1d-4310-b5a9-7589f58332b7","Type":"ContainerStarted","Data":"288ace7ca584a62c2ab8e66d00bac957d60edf6dda9e32e08c224f85b2035387"} Apr 23 09:03:18.534065 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.533943 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:18.555959 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.555908 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" podStartSLOduration=1.555894693 podStartE2EDuration="1.555894693s" podCreationTimestamp="2026-04-23 09:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:03:18.554558369 +0000 UTC m=+2920.485921169" watchObservedRunningTime="2026-04-23 09:03:18.555894693 +0000 UTC m=+2920.487257493" Apr 23 09:03:18.767441 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.767411 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-46nvx_3ada2676-04c4-4126-a943-cd1d167949aa/dns/0.log" Apr 23 09:03:18.812141 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.812051 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-46nvx_3ada2676-04c4-4126-a943-cd1d167949aa/kube-rbac-proxy/0.log" Apr 23 09:03:18.992140 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:18.992109 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vdfxl_556cc9f0-a576-455e-b539-83577cba025c/dns-node-resolver/0.log" Apr 23 09:03:19.461751 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:19.461720 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-86wvz_3f5d8347-124b-469f-8ac6-0c963d6c4634/node-ca/0.log" Apr 23 09:03:20.640985 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:20.640951 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ttph8_c7c0ad21-b2af-4a80-a79c-000cff3a91ab/serve-healthcheck-canary/0.log" Apr 23 09:03:21.048720 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:21.048664 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-km78h_2cdf9e60-6a76-44c7-a819-39654b29c96a/insights-operator/0.log" Apr 23 09:03:21.049828 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:21.049807 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-km78h_2cdf9e60-6a76-44c7-a819-39654b29c96a/insights-operator/1.log" Apr 23 09:03:21.158606 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:21.158581 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gvqs5_34496dd2-18a1-4fe2-a3be-b2d24e4bd928/kube-rbac-proxy/0.log" Apr 23 09:03:21.184091 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:21.184065 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gvqs5_34496dd2-18a1-4fe2-a3be-b2d24e4bd928/exporter/0.log" Apr 23 09:03:21.213520 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:21.213495 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gvqs5_34496dd2-18a1-4fe2-a3be-b2d24e4bd928/extractor/0.log" Apr 23 09:03:22.899980 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:22.899942 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-7759cfb7f9-khn4t_37b79d11-c7a5-4914-8fc3-ffd0818da189/manager/0.log" Apr 23 09:03:24.548221 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:24.548194 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-wctb8" Apr 23 09:03:26.253824 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:26.253787 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-pjbfn_7097634a-9704-4d5b-a292-4c6376bde24a/migrator/0.log" Apr 23 09:03:26.277732 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:26.277703 2561 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-pjbfn_7097634a-9704-4d5b-a292-4c6376bde24a/graceful-termination/0.log" Apr 23 09:03:26.658172 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:26.658083 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6qc5q_a42ca4f9-a9ae-4413-9c3d-fa18098d565a/kube-storage-version-migrator-operator/1.log" Apr 23 09:03:26.659962 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:26.659934 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6qc5q_a42ca4f9-a9ae-4413-9c3d-fa18098d565a/kube-storage-version-migrator-operator/0.log" Apr 23 09:03:27.650438 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:27.650406 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg44l_0a8488f0-d2d8-4107-b542-5f46729c4927/kube-multus-additional-cni-plugins/0.log" Apr 23 09:03:27.685710 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:27.685689 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg44l_0a8488f0-d2d8-4107-b542-5f46729c4927/egress-router-binary-copy/0.log" Apr 23 09:03:27.711070 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:27.711045 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg44l_0a8488f0-d2d8-4107-b542-5f46729c4927/cni-plugins/0.log" Apr 23 09:03:27.738755 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:27.738726 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg44l_0a8488f0-d2d8-4107-b542-5f46729c4927/bond-cni-plugin/0.log" Apr 23 09:03:27.766651 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:27.766592 2561 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg44l_0a8488f0-d2d8-4107-b542-5f46729c4927/routeoverride-cni/0.log" Apr 23 09:03:27.791472 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:27.791450 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg44l_0a8488f0-d2d8-4107-b542-5f46729c4927/whereabouts-cni-bincopy/0.log" Apr 23 09:03:27.818877 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:27.818857 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg44l_0a8488f0-d2d8-4107-b542-5f46729c4927/whereabouts-cni/0.log" Apr 23 09:03:28.432008 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:28.431974 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jgxcr_892bfeb4-76ad-49cf-b615-dfa772b87a7e/kube-multus/0.log" Apr 23 09:03:28.565062 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:28.565023 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pmv55_e92a791e-42ac-4855-b7b5-945f53108891/network-metrics-daemon/0.log" Apr 23 09:03:28.588307 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:28.588279 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pmv55_e92a791e-42ac-4855-b7b5-945f53108891/kube-rbac-proxy/0.log" Apr 23 09:03:29.366887 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:29.366854 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5wkc_05731c48-9bfe-46ed-8390-b6d811272383/ovn-controller/0.log" Apr 23 09:03:29.413900 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:29.413871 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5wkc_05731c48-9bfe-46ed-8390-b6d811272383/ovn-acl-logging/0.log" Apr 23 09:03:29.446292 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:29.446243 2561 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5wkc_05731c48-9bfe-46ed-8390-b6d811272383/kube-rbac-proxy-node/0.log" Apr 23 09:03:29.470859 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:29.470833 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5wkc_05731c48-9bfe-46ed-8390-b6d811272383/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 09:03:29.490942 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:29.490910 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5wkc_05731c48-9bfe-46ed-8390-b6d811272383/northd/0.log" Apr 23 09:03:29.515672 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:29.515649 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5wkc_05731c48-9bfe-46ed-8390-b6d811272383/nbdb/0.log" Apr 23 09:03:29.542988 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:29.542939 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5wkc_05731c48-9bfe-46ed-8390-b6d811272383/sbdb/0.log" Apr 23 09:03:29.705348 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:29.705297 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5wkc_05731c48-9bfe-46ed-8390-b6d811272383/ovnkube-controller/0.log" Apr 23 09:03:31.571148 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:31.571115 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-cqjnh_fc7bcc2c-0662-49d5-846d-e6a5358d369a/check-endpoints/0.log" Apr 23 09:03:31.631520 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:31.631491 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mbfqt_59f9a0a5-064a-4dd4-9790-0bff108c8fbe/network-check-target-container/0.log" Apr 23 09:03:32.614230 ip-10-0-134-8 kubenswrapper[2561]: 
I0423 09:03:32.614202 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-l76bb_02a74ef7-7607-44c8-9c82-00f2c73ba0e8/iptables-alerter/0.log" Apr 23 09:03:33.362668 ip-10-0-134-8 kubenswrapper[2561]: I0423 09:03:33.362587 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zngnf_2b615e73-dc52-4885-94d7-dc4fecd877f6/tuned/0.log"