Mar 12 13:35:41.405724 ip-10-0-139-20 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Mar 12 13:35:41.405737 ip-10-0-139-20 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Mar 12 13:35:41.405744 ip-10-0-139-20 systemd[1]: kubelet.service: Failed with result 'resources'.
Mar 12 13:35:41.406046 ip-10-0-139-20 systemd[1]: Failed to start Kubernetes Kubelet.
Mar 12 13:35:51.525507 ip-10-0-139-20 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Mar 12 13:35:51.525520 ip-10-0-139-20 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2dc5e224115a49c38e40c671399cdd79 --
Mar 12 13:37:45.792950 ip-10-0-139-20 systemd[1]: Starting Kubernetes Kubelet...
Mar 12 13:37:46.267014 ip-10-0-139-20 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 13:37:46.267014 ip-10-0-139-20 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 12 13:37:46.267014 ip-10-0-139-20 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 13:37:46.267014 ip-10-0-139-20 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 13:37:46.267014 ip-10-0-139-20 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 13:37:46.269599 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.269514 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 13:37:46.272243 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272228 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 13:37:46.272243 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272244 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272248 2575 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272251 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272254 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272257 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272260 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272263 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272267 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272270 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272273 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272275 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272278 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272280 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272283 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272285 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272288 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272290 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272293 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272295 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272298 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:37:46.272306 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272301 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272303 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272306 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272308 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272311 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272314 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272317 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272320 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272322 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272325 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272327 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272330 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272333 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272335 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272338 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272340 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272343 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272345 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272349 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272351 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 13:37:46.272786 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272354 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272356 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272359 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272361 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272364 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272366 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272369 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272371 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272374 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272377 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272379 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272382 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272386 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272389 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272393 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272396 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272399 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272402 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272405 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:37:46.273269 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272407 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272410 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272412 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272415 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272418 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272420 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272423 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272426 2575 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272430 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272435 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272437 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272440 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272442 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272444 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272447 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272449 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272451 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272454 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272457 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 13:37:46.273763 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272459 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272461 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272464 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272466 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272476 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272478 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.272481 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273466 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273473 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273477 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273480 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273483 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273485 2575 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273488 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273490 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273494 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273497 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273499 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273502 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273505 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:37:46.274220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273508 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273510 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273513 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273515 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273518 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273520 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273523 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273525 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273527 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273530 2575 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273532 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273534 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273537 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273540 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273542 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273545 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273547 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273549 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273552 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273555 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 13:37:46.274724 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273558 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273560 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273563 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273565 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273568 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273572 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273576 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273579 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273581 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273584 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273586 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273589 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273592 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273594 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273597 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273599 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273601 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273604 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273606 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 13:37:46.275220 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273609 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273611 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273614 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273616 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273619 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273621 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273623 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273626 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273628 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273631 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273633 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273635 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273638 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273641 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273644 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273659 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273662 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273664 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273667 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 13:37:46.275712 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273669 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273675 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273679 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273682 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273685 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273688 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273691 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273694 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273697 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273700 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273702 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273705 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273708 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273710 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.273712 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273784 2575 flags.go:64] FLAG: --address="0.0.0.0"
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273797 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273804 2575 flags.go:64] FLAG: --anonymous-auth="true"
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273808 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273813 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 12 13:37:46.276174 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273816 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273821 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273825 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273828 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273831 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273835 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273839 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273842 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273845 2575 flags.go:64] FLAG: --cgroup-root=""
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273848 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273851 2575 flags.go:64] FLAG: --client-ca-file=""
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273854 2575 flags.go:64] FLAG: --cloud-config=""
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273857 2575 flags.go:64] FLAG: --cloud-provider="external"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273860 2575 flags.go:64] FLAG: --cluster-dns="[]"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273865 2575 flags.go:64] FLAG: --cluster-domain=""
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273868 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273871 2575 flags.go:64] FLAG: --config-dir=""
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273874 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273878 2575 flags.go:64] FLAG: --container-log-max-files="5"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273881 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273885 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273888 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273891 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273894 2575 flags.go:64] FLAG: --contention-profiling="false"
Mar 12 13:37:46.276686 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273897 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273900 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273904 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273906 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273910 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273913 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273916 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273919 2575 flags.go:64] FLAG: --enable-load-reader="false"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273922 2575 flags.go:64] FLAG: --enable-server="true"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273925 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273929 2575 flags.go:64] FLAG: --event-burst="100"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273932 2575 flags.go:64] FLAG: --event-qps="50"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273935 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273938 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273941 2575 flags.go:64] FLAG: --eviction-hard=""
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273945 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273948 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273951 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273954 2575 flags.go:64] FLAG: --eviction-soft=""
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273957 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273960 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273965 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273968 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273971 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273974 2575 flags.go:64] FLAG: --fail-swap-on="true"
Mar 12 13:37:46.277290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273977 2575 flags.go:64] FLAG: --feature-gates=""
Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273981 2575 flags.go:64] FLAG:
--file-check-frequency="20s" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273984 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273987 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273991 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273994 2575 flags.go:64] FLAG: --healthz-port="10248" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.273997 2575 flags.go:64] FLAG: --help="false" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274000 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-139-20.ec2.internal" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274003 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274006 2575 flags.go:64] FLAG: --http-check-frequency="20s" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274009 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274012 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274016 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274019 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274021 2575 flags.go:64] FLAG: --image-service-endpoint="" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 
13:37:46.274024 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274027 2575 flags.go:64] FLAG: --kube-api-burst="100" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274030 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274033 2575 flags.go:64] FLAG: --kube-api-qps="50" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274036 2575 flags.go:64] FLAG: --kube-reserved="" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274038 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274041 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274044 2575 flags.go:64] FLAG: --kubelet-cgroups="" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274048 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 12 13:37:46.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274051 2575 flags.go:64] FLAG: --lock-file="" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274053 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274056 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274059 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274066 2575 flags.go:64] FLAG: --log-json-split-stream="false" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274068 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 12 13:37:46.278611 
ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274071 2575 flags.go:64] FLAG: --log-text-split-stream="false" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274074 2575 flags.go:64] FLAG: --logging-format="text" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274077 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274080 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274083 2575 flags.go:64] FLAG: --manifest-url="" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274086 2575 flags.go:64] FLAG: --manifest-url-header="" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274090 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274094 2575 flags.go:64] FLAG: --max-open-files="1000000" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274098 2575 flags.go:64] FLAG: --max-pods="110" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274100 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274103 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274106 2575 flags.go:64] FLAG: --memory-manager-policy="None" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274109 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274112 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274114 2575 
flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274117 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274124 2575 flags.go:64] FLAG: --node-status-max-images="50" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274128 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274131 2575 flags.go:64] FLAG: --oom-score-adj="-999" Mar 12 13:37:46.278611 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274133 2575 flags.go:64] FLAG: --pod-cidr="" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274136 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274142 2575 flags.go:64] FLAG: --pod-manifest-path="" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274144 2575 flags.go:64] FLAG: --pod-max-pids="-1" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274147 2575 flags.go:64] FLAG: --pods-per-core="0" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274150 2575 flags.go:64] FLAG: --port="10250" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274156 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274159 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-083464c0637db7a62" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274162 2575 flags.go:64] FLAG: --qos-reserved="" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274165 
2575 flags.go:64] FLAG: --read-only-port="10255" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274168 2575 flags.go:64] FLAG: --register-node="true" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274172 2575 flags.go:64] FLAG: --register-schedulable="true" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274175 2575 flags.go:64] FLAG: --register-with-taints="" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274179 2575 flags.go:64] FLAG: --registry-burst="10" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274181 2575 flags.go:64] FLAG: --registry-qps="5" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274184 2575 flags.go:64] FLAG: --reserved-cpus="" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274187 2575 flags.go:64] FLAG: --reserved-memory="" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274191 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274194 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274197 2575 flags.go:64] FLAG: --rotate-certificates="false" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274200 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274203 2575 flags.go:64] FLAG: --runonce="false" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274206 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274209 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274211 2575 
flags.go:64] FLAG: --seccomp-default="false" Mar 12 13:37:46.279229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274214 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274217 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274220 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274223 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274227 2575 flags.go:64] FLAG: --storage-driver-password="root" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274230 2575 flags.go:64] FLAG: --storage-driver-secure="false" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274233 2575 flags.go:64] FLAG: --storage-driver-table="stats" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274235 2575 flags.go:64] FLAG: --storage-driver-user="root" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274238 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274241 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274244 2575 flags.go:64] FLAG: --system-cgroups="" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274247 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274252 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274256 2575 flags.go:64] FLAG: --tls-cert-file="" Mar 12 
13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274258 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274262 2575 flags.go:64] FLAG: --tls-min-version="" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274265 2575 flags.go:64] FLAG: --tls-private-key-file="" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274268 2575 flags.go:64] FLAG: --topology-manager-policy="none" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274272 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274275 2575 flags.go:64] FLAG: --topology-manager-scope="container" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274277 2575 flags.go:64] FLAG: --v="2" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274281 2575 flags.go:64] FLAG: --version="false" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274286 2575 flags.go:64] FLAG: --vmodule="" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274290 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.274293 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 12 13:37:46.279893 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274390 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274394 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274397 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 12 13:37:46.280515 ip-10-0-139-20 
kubenswrapper[2575]: W0312 13:37:46.274399 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274403 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274405 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274408 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274410 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274413 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274415 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274418 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274420 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274423 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274425 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274427 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274430 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 12 13:37:46.280515 
ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274432 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274435 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274438 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274440 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 12 13:37:46.280515 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274444 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274447 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274449 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274452 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274454 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274458 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274461 2575 feature_gate.go:328] unrecognized feature gate: Example Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274463 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274465 2575 
feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274468 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274470 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274473 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274475 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274478 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274481 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274483 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274485 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274488 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274491 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 12 13:37:46.281057 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274495 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
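The two `feature_gate.go:328` warning bursts in this log are noisy and mostly repeat the same gate names across passes. When triaging which gates the kubelet actually rejected, it can help to reduce the journal text to the unique names in first-seen order. A minimal sketch (the function name `unrecognized_gates` and the sample string are illustrative, not part of the log):

```python
import re

def unrecognized_gates(journal_text: str) -> list[str]:
    """Collect the unique feature-gate names the kubelet rejected,
    in first-seen order, from raw journal output."""
    seen: dict[str, None] = {}
    # The kubelet prints: "feature_gate.go:328] unrecognized feature gate: <Name>"
    for name in re.findall(r"unrecognized feature gate: (\S+)", journal_text):
        seen.setdefault(name, None)
    return list(seen)

# Two warnings for the same gate collapse to one entry.
sample = (
    "W0312 13:37:46.273710 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig "
    "W0312 13:37:46.273712 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure "
    "W0312 13:37:46.274394 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure"
)
print(unrecognized_gates(sample))  # ['IrreconcilableMachineConfig', 'MultiArchInstallAzure']
```

In practice the input would come from something like `journalctl -u kubelet`; these warnings are expected on OpenShift, where operator-level gates are passed to the kubelet but only the Kubernetes-level ones are recognized.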
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274498    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274501    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274504    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274506    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274509    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274511    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274515    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274518    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274521    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274524    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274526    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274529    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274532    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274534    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274537    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274540    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274542    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274546    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 13:37:46.281525 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274548    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274551    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274554    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274556    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274558    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274561    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274563    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274565    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274568    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274570    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274573    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274575    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274578    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274581    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274583    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274585    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274588    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274591    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274593    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274595    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 13:37:46.282099 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274598    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274600    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274603    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274605    2575 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274607    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274610    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274613    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.274615    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.275335    2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.281855    2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.281871    2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281922    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281928    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281931    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281934    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281937    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 13:37:46.282582 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281940    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281942    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281945    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281948    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281951    2575 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281953    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281956    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281958    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281967    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281970    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281972    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281975    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281977    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281980    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281983    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281985    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281988    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281992    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.281996    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282002    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:37:46.283006 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282005    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282008    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282010    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282013    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282015    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282018    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:37:46.283531
ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282021 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282024 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282026 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282028 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282031 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282033 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282036 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282039 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282041 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282043 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282046 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282048 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282051 2575 feature_gate.go:328] 
unrecognized feature gate: VolumeGroupSnapshot Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282053 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 13:37:46.283531 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282055 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282059 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282061 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282064 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282066 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282069 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282071 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282073 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282076 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282079 2575 feature_gate.go:328] unrecognized feature gate: Example2 Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282081 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 
13:37:46.282084 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282087 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282089 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282092 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282094 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282096 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282099 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282101 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282104 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 12 13:37:46.284048 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282106 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282109 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282112 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282115 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 12 
13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282117 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282119 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282122 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282124 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282127 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282129 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282132 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282134 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282136 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282139 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282146 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282149 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282152 2575 feature_gate.go:328] unrecognized feature gate: 
MetricsCollectionProfiles Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282155 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282158 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282160 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 12 13:37:46.284535 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282164 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.282169 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282267 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282272 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282275 2575 feature_gate.go:328] unrecognized feature gate: Example2 Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282278 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282280 2575 
feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282283 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282286 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282289 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282291 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282294 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282296 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282298 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282301 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282303 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 12 13:37:46.285053 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282306 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282308 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282311 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282313 
2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282316 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282318 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282321 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282323 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282326 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282328 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282331 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282334 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282336 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282339 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282342 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282344 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 12 
13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282347 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282349 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282352 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282354 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 12 13:37:46.285453 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282357 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282359 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282362 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282364 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282367 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282369 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282372 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282374 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282376 2575 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesAWS Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282379 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282382 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282385 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282388 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282391 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282394 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282397 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282399 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282401 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282404 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 12 13:37:46.285956 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282406 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282409 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 12 13:37:46.286428 
ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282411 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282414 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282417 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282420 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282423 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282426 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282428 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282431 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282434 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282436 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282439 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282441 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 
13:37:46.282443 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282446 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282448 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282451 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282453 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 12 13:37:46.286428 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282456 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282458 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282460 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282463 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282465 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282467 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282470 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282472 2575 feature_gate.go:328] unrecognized feature gate: Example Mar 12 13:37:46.286906 
ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282475 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282477 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282479 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282482 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282484 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:46.282486 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.282491 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 12 13:37:46.286906 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.283184 2575 server.go:962] "Client rotation is on, will bootstrap in background" Mar 12 13:37:46.287286 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.285054 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 12 13:37:46.287286 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.286022 2575 server.go:1019] "Starting client 
certificate rotation" Mar 12 13:37:46.287286 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.286146 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 12 13:37:46.287286 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.286181 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 12 13:37:46.313662 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.313633 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 13:37:46.319603 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.319572 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 13:37:46.336801 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.336778 2575 log.go:25] "Validated CRI v1 runtime API" Mar 12 13:37:46.342439 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.342419 2575 log.go:25] "Validated CRI v1 image API" Mar 12 13:37:46.345700 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.345673 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 12 13:37:46.345893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.345876 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 12 13:37:46.350367 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.350338 2575 fs.go:135] Filesystem UUIDs: map[1e869cbc-44e3-492f-bf55-1dcec72fc657:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b0ebda66-44ba-484a-ac0c-28db9166d016:/dev/nvme0n1p3] Mar 12 13:37:46.350443 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.350366 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} 
/dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Mar 12 13:37:46.357086 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.356971 2575 manager.go:217] Machine: {Timestamp:2026-03-12 13:37:46.354702578 +0000 UTC m=+0.444226664 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104421 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e4ddd72c6fc790b37feb3037218ab SystemUUID:ec2e4ddd-72c6-fc79-0b37-feb3037218ab BootID:2dc5e224-115a-49c3-8e40-c671399cdd79 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:87:33:49:03:c5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:87:33:49:03:c5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:53:d8:e7:b4:e4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} 
{PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 12 13:37:46.357086 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.357074 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 12 13:37:46.357248 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.357172 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 12 13:37:46.358361 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.358333 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 13:37:46.358536 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.358363 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-20.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 13:37:46.358613 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.358550 2575 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 13:37:46.358613 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.358564 2575 container_manager_linux.go:306] "Creating device plugin manager"
Mar 12 13:37:46.358613 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.358582 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 12 13:37:46.359665 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.359635 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 12 13:37:46.361644 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.361631 2575 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 13:37:46.361788 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.361777 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 12 13:37:46.365098 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.364990 2575 kubelet.go:491] "Attempting to sync node with API server"
Mar 12 13:37:46.365098 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.365015 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 13:37:46.365098 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.365032 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 12 13:37:46.365098 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.365048 2575 kubelet.go:397] "Adding apiserver pod source"
Mar 12 13:37:46.365098 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.365095 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 13:37:46.366219 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.366206 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 12 13:37:46.366292 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.366230 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 12 13:37:46.369556 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.369538 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1"
Mar 12 13:37:46.370067 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.370051 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lfblj"
Mar 12 13:37:46.371716 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.371701 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 13:37:46.374461 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374444 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 12 13:37:46.374522 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374466 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 12 13:37:46.374522 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374480 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 12 13:37:46.374522 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374488 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 12 13:37:46.374522 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374498 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 12 13:37:46.374522 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374506 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 12 13:37:46.374522 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374514 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 12 13:37:46.374522 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374523 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 12 13:37:46.374735 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374534 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 12 13:37:46.374735 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374543 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 12 13:37:46.374735 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374567 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 12 13:37:46.374735 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.374580 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 12 13:37:46.377887 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.377866 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 12 13:37:46.378097 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.378083 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Mar 12 13:37:46.378097 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.378088 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lfblj"
Mar 12 13:37:46.380257 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.380231 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-20.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 13:37:46.380483 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.380452 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 13:37:46.382525 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.382508 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 12 13:37:46.382604 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.382564 2575 server.go:1295] "Started kubelet"
Mar 12 13:37:46.382738 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.382711 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 13:37:46.382738 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.382707 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 13:37:46.382874 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.382767 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 12 13:37:46.383328 ip-10-0-139-20 systemd[1]: Started Kubernetes Kubelet.
Mar 12 13:37:46.384418 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.384397 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 13:37:46.385796 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.385783 2575 server.go:317] "Adding debug handlers to kubelet server"
Mar 12 13:37:46.393241 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.393221 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 12 13:37:46.393466 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.393436 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Mar 12 13:37:46.394229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.394215 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 13:37:46.395049 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.394922 2575 factory.go:55] Registering systemd factory
Mar 12 13:37:46.395117 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.395092 2575 factory.go:223] Registration of the systemd container factory successfully
Mar 12 13:37:46.395117 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.395095 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:46.395117 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.394943 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Mar 12 13:37:46.395229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.395121 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 12 13:37:46.395229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.394945 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 12 13:37:46.395229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.395177 2575 reconstruct.go:97] "Volume reconstruction finished"
Mar 12 13:37:46.395229 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.395187 2575 reconciler.go:26] "Reconciler: start to sync state"
Mar 12 13:37:46.395358 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.395311 2575 factory.go:153] Registering CRI-O factory
Mar 12 13:37:46.395358 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.395321 2575 factory.go:223] Registration of the crio container factory successfully
Mar 12 13:37:46.395423 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.395396 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 12 13:37:46.395423 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.395414 2575 factory.go:103] Registering Raw factory
Mar 12 13:37:46.395423 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.395423 2575 manager.go:1196] Started watching for new ooms in manager
Mar 12 13:37:46.396577 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.396561 2575 manager.go:319] Starting recovery of all containers
Mar 12 13:37:46.396822 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.396795 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 12 13:37:46.398432 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.398414 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-20.ec2.internal" not found
Mar 12 13:37:46.400007 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.399979 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-20.ec2.internal\" not found" node="ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.406316 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.406302 2575 manager.go:324] Recovery completed
Mar 12 13:37:46.411153 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.411139 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:46.414263 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.414247 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:46.414362 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.414281 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:46.414362 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.414295 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:46.414758 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.414747 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Mar 12 13:37:46.414798 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.414759 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Mar 12 13:37:46.414798 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.414779 2575 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 13:37:46.414898 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.414879 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-20.ec2.internal" not found
Mar 12 13:37:46.416880 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.416867 2575 policy_none.go:49] "None policy: Start"
Mar 12 13:37:46.416935 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.416884 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 12 13:37:46.416935 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.416895 2575 state_mem.go:35] "Initializing new in-memory state store"
Mar 12 13:37:46.452196 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.452177 2575 manager.go:341] "Starting Device Plugin manager"
Mar 12 13:37:46.452269 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.452214 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 12 13:37:46.452269 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.452225 2575 server.go:85] "Starting device plugin registration server"
Mar 12 13:37:46.452458 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.452441 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 13:37:46.452555 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.452459 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 13:37:46.452604 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.452595 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 12 13:37:46.452695 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.452682 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 12 13:37:46.452695 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.452694 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 13:37:46.453261 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.453243 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Mar 12 13:37:46.453345 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.453283 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:46.469109 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.469082 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 12 13:37:46.484632 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.470437 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 12 13:37:46.484632 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.470458 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 12 13:37:46.484632 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.470476 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 13:37:46.484632 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.470484 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Mar 12 13:37:46.484632 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.470518 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 12 13:37:46.484632 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.473735 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 12 13:37:46.484632 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.475276 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-20.ec2.internal" not found
Mar 12 13:37:46.553058 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.553005 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:46.553952 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.553939 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:46.554000 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.553967 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:46.554000 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.553978 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:46.554072 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.554001 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.566333 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.566315 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.566407 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.566337 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-20.ec2.internal\": node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:46.571432 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.571402 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal"]
Mar 12 13:37:46.571497 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.571479 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:46.572337 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.572318 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:46.572432 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.572351 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:46.572432 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.572365 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:46.574733 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.574718 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:46.574829 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.574810 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.574877 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.574852 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:46.575604 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.575587 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:46.575715 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.575616 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:46.575715 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.575637 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:46.575715 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.575685 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:46.575715 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.575707 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:46.575715 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.575717 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:46.577796 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.577780 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.577876 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.577806 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:46.579465 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.579447 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:46.579561 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.579480 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:46.579561 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.579494 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:46.590502 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.590485 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:46.596693 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.596666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc0a321c7bdfae40265690c7f19683de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal\" (UID: \"fc0a321c7bdfae40265690c7f19683de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.596763 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.596700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc0a321c7bdfae40265690c7f19683de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal\" (UID: \"fc0a321c7bdfae40265690c7f19683de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.596763 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.596722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/31268ff111c3b6a3e6157c28dc36e73e-config\") pod \"kube-apiserver-proxy-ip-10-0-139-20.ec2.internal\" (UID: \"31268ff111c3b6a3e6157c28dc36e73e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.601467 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.601445 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-20.ec2.internal\" not found" node="ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.605854 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.605837 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-20.ec2.internal\" not found" node="ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.690579 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.690561 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:46.696884 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.696864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc0a321c7bdfae40265690c7f19683de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal\" (UID: \"fc0a321c7bdfae40265690c7f19683de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.696980 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.696890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc0a321c7bdfae40265690c7f19683de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal\" (UID: \"fc0a321c7bdfae40265690c7f19683de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.696980 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.696914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/31268ff111c3b6a3e6157c28dc36e73e-config\") pod \"kube-apiserver-proxy-ip-10-0-139-20.ec2.internal\" (UID: \"31268ff111c3b6a3e6157c28dc36e73e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.696980 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.696969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc0a321c7bdfae40265690c7f19683de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal\" (UID: \"fc0a321c7bdfae40265690c7f19683de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.697078 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.696978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc0a321c7bdfae40265690c7f19683de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal\" (UID: \"fc0a321c7bdfae40265690c7f19683de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.697078 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.697014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/31268ff111c3b6a3e6157c28dc36e73e-config\") pod \"kube-apiserver-proxy-ip-10-0-139-20.ec2.internal\" (UID: \"31268ff111c3b6a3e6157c28dc36e73e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.791027 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.791006 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:46.891426 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.891373 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:46.904781 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.904760 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.908446 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:46.908434 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:46.991709 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:46.991674 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:47.092249 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:47.092214 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:47.192765 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:47.192697 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:47.286246 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.286213 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 12 13:37:47.286843 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.286346 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 12 13:37:47.286843 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.286389 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 12 13:37:47.293352 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:47.293335 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:47.382314 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.382274 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-11 13:32:46 +0000 UTC" deadline="2027-10-19 12:26:55.325028306 +0000 UTC"
Mar 12 13:37:47.382314 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.382310 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14062h49m7.942721125s"
Mar 12 13:37:47.394432 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:47.394406 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-20.ec2.internal\" not found"
Mar 12 13:37:47.394538 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.394434 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 12 13:37:47.431307 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.431280 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 12 13:37:47.439439 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.439419 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 12 13:37:47.468601 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:47.468572 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31268ff111c3b6a3e6157c28dc36e73e.slice/crio-7e245a92aa6863afe2884e3e5ffb184be804ea23dab65d95e09f0e233400cc71 WatchSource:0}: Error finding container 7e245a92aa6863afe2884e3e5ffb184be804ea23dab65d95e09f0e233400cc71: Status 404 returned error can't find the container with id 7e245a92aa6863afe2884e3e5ffb184be804ea23dab65d95e09f0e233400cc71
Mar 12 13:37:47.474415 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.474403 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 13:37:47.495109 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.495084 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:47.520363 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.520340 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 12 13:37:47.522456 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.522444 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal"
Mar 12 13:37:47.528642 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.528624 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jf7vb"
Mar 12 13:37:47.537641 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.537620 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 12 13:37:47.549682 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.549662 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jf7vb"
Mar 12 13:37:47.874826 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:47.874794 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0a321c7bdfae40265690c7f19683de.slice/crio-49c76a46043f26db166e4570753388be9717fed80232b152159cab90b54fad77 WatchSource:0}: Error finding container 49c76a46043f26db166e4570753388be9717fed80232b152159cab90b54fad77: Status 404 returned error can't find the container with id 49c76a46043f26db166e4570753388be9717fed80232b152159cab90b54fad77
Mar 12 13:37:47.971003 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:47.970977 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 12 13:37:48.150795 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.150716 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 12 13:37:48.366233 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.366207 2575 apiserver.go:52] "Watching apiserver"
Mar 12 13:37:48.375803 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.375769 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 12 13:37:48.376753 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.376722 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Mar 12 13:37:48.377097 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.377062 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal","openshift-cluster-node-tuning-operator/tuned-qjdg7","openshift-image-registry/node-ca-g64nq","openshift-multus/multus-2jt74","openshift-multus/multus-additional-cni-plugins-qq8v5","openshift-network-diagnostics/network-check-target-mms2n","openshift-network-operator/iptables-alerter-4rzbn","openshift-ovn-kubernetes/ovnkube-node-plcmr","kube-system/konnectivity-agent-w2psw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal","openshift-multus/network-metrics-daemon-qwv64"] Mar 12 13:37:48.382935 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.382912 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.385039 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.385015 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g64nq" Mar 12 13:37:48.385151 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.385101 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.387164 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.387145 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.388241 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.388222 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 12 13:37:48.388469 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.388445 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wq45j\"" Mar 12 13:37:48.389598 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.389577 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:48.389718 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:48.389665 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:37:48.390886 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.390863 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 12 13:37:48.391817 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.391802 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4rzbn" Mar 12 13:37:48.393354 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.393329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8w8rg\"" Mar 12 13:37:48.393568 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.393547 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 12 13:37:48.393681 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.393619 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 12 13:37:48.393782 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.393765 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 12 13:37:48.393843 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.393804 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 12 13:37:48.394096 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.394080 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.395041 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.394783 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 12 13:37:48.395041 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.394797 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 12 13:37:48.395041 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.394827 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 12 13:37:48.395041 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.394846 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-4s45x\"" Mar 12 13:37:48.395041 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.394854 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 12 13:37:48.395041 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.394790 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2xpcw\"" Mar 12 13:37:48.395297 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.395116 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 12 13:37:48.395297 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.395222 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 12 13:37:48.396434 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.396418 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-w2psw" Mar 12 13:37:48.396633 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.396613 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 12 13:37:48.397155 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.397136 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zxwpr\"" Mar 12 13:37:48.397550 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.397530 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 12 13:37:48.397670 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.397599 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s9hkx\"" Mar 12 13:37:48.398467 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.398449 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 12 13:37:48.398550 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.398481 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 12 13:37:48.398550 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.398508 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 12 13:37:48.398759 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.398744 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.399937 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.399917 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 12 13:37:48.400037 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.399981 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 12 13:37:48.400326 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.400311 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-z88dg\"" Mar 12 13:37:48.400509 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.400494 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 12 13:37:48.400717 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.400702 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 12 13:37:48.401303 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.401264 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 12 13:37:48.401667 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.401633 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 12 13:37:48.401894 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.401879 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jq4m\"" Mar 12 13:37:48.401969 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.401882 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:48.402033 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:48.402012 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:37:48.402084 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.402024 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 12 13:37:48.402125 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.402100 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405223 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-cni-bin\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24428774-0c1d-4253-a9b0-384ed1b79796-ovnkube-config\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-system-cni-dir\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-hostroot\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sw4m\" (UniqueName: \"kubernetes.io/projected/c07aa00c-e596-44da-b75d-f3772a7057fd-kube-api-access-7sw4m\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-etc-openvswitch\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.405857 ip-10-0-139-20 
kubenswrapper[2575]: I0312 13:37:48.405433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-run-ovn-kubernetes\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405455 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-var-lib-kubelet\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-socket-dir-parent\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-conf-dir\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf9rp\" (UniqueName: \"kubernetes.io/projected/fc8195c5-3667-46e7-8bca-1b80b2d9943d-kube-api-access-bf9rp\") pod \"node-ca-g64nq\" (UID: 
\"fc8195c5-3667-46e7-8bca-1b80b2d9943d\") " pod="openshift-image-registry/node-ca-g64nq" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-os-release\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c07aa00c-e596-44da-b75d-f3772a7057fd-cni-binary-copy\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405610 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c07aa00c-e596-44da-b75d-f3772a7057fd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.405857 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-kubelet\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-modprobe-d\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-sys\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fc8195c5-3667-46e7-8bca-1b80b2d9943d-serviceca\") pod \"node-ca-g64nq\" (UID: \"fc8195c5-3667-46e7-8bca-1b80b2d9943d\") " pod="openshift-image-registry/node-ca-g64nq" Mar 12 13:37:48.406764 
ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-sysctl-conf\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405919 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-run-k8s-cni-cncf-io\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405942 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-run-netns\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-os-release\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.405987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-slash\") pod \"ovnkube-node-plcmr\" (UID: 
\"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-sysctl-d\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-systemd-units\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-run-openvswitch\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406223 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-cni-netd\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-lib-modules\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.406764 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-tuned\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c07aa00c-e596-44da-b75d-f3772a7057fd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6-host-slash\") pod \"iptables-alerter-4rzbn\" (UID: \"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6\") " pod="openshift-network-operator/iptables-alerter-4rzbn" Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-run-systemd\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 
13:37:48.406447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4djll\" (UniqueName: \"kubernetes.io/projected/4a7d4073-afc5-478a-8838-a78fa193f1bd-kube-api-access-4djll\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-cnibin\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-var-lib-cni-bin\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc8195c5-3667-46e7-8bca-1b80b2d9943d-host\") pod \"node-ca-g64nq\" (UID: \"fc8195c5-3667-46e7-8bca-1b80b2d9943d\") " pod="openshift-image-registry/node-ca-g64nq"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-cni-dir\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-var-lib-cni-multus\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-daemon-config\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-cnibin\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-run-ovn\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6js9q\" (UniqueName: \"kubernetes.io/projected/24428774-0c1d-4253-a9b0-384ed1b79796-kube-api-access-6js9q\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-kubernetes\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-var-lib-openvswitch\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.407313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rthhl\" (UniqueName: \"kubernetes.io/projected/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-kube-api-access-rthhl\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6-iptables-alerter-script\") pod \"iptables-alerter-4rzbn\" (UID: \"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6\") " pod="openshift-network-operator/iptables-alerter-4rzbn"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-run-netns\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406782 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-log-socket\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24428774-0c1d-4253-a9b0-384ed1b79796-ovnkube-script-lib\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-sysconfig\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-var-lib-kubelet\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-system-cni-dir\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqkpp\" (UniqueName: \"kubernetes.io/projected/7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6-kube-api-access-rqkpp\") pod \"iptables-alerter-4rzbn\" (UID: \"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6\") " pod="openshift-network-operator/iptables-alerter-4rzbn"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4e3e43f3-29b4-45df-8953-0095e22a0d55-konnectivity-ca\") pod \"konnectivity-agent-w2psw\" (UID: \"4e3e43f3-29b4-45df-8953-0095e22a0d55\") " pod="kube-system/konnectivity-agent-w2psw"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-run\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-host\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-systemd\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.408052 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.406997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-cni-binary-copy\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.409025 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.407321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24428774-0c1d-4253-a9b0-384ed1b79796-ovn-node-metrics-cert\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.409122 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.409059 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4e3e43f3-29b4-45df-8953-0095e22a0d55-agent-certs\") pod \"konnectivity-agent-w2psw\" (UID: \"4e3e43f3-29b4-45df-8953-0095e22a0d55\") " pod="kube-system/konnectivity-agent-w2psw"
Mar 12 13:37:48.409122 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.409093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-run-multus-certs\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.409215 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.409125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-node-log\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.409215 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.409151 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24428774-0c1d-4253-a9b0-384ed1b79796-env-overrides\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.409215 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.409181 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4a7d4073-afc5-478a-8838-a78fa193f1bd-tmp\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.409215 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.409210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-etc-kubernetes\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.476168 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.476115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal" event={"ID":"fc0a321c7bdfae40265690c7f19683de","Type":"ContainerStarted","Data":"49c76a46043f26db166e4570753388be9717fed80232b152159cab90b54fad77"}
Mar 12 13:37:48.477613 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.477583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal" event={"ID":"31268ff111c3b6a3e6157c28dc36e73e","Type":"ContainerStarted","Data":"7e245a92aa6863afe2884e3e5ffb184be804ea23dab65d95e09f0e233400cc71"}
Mar 12 13:37:48.496262 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.496238 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 12 13:37:48.509966 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.509931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-lib-modules\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.510093 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.509980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-tuned\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.510093 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c07aa00c-e596-44da-b75d-f3772a7057fd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5"
Mar 12 13:37:48.510093 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6-host-slash\") pod \"iptables-alerter-4rzbn\" (UID: \"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6\") " pod="openshift-network-operator/iptables-alerter-4rzbn"
Mar 12 13:37:48.510093 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-run-systemd\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.510093 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4djll\" (UniqueName: \"kubernetes.io/projected/4a7d4073-afc5-478a-8838-a78fa193f1bd-kube-api-access-4djll\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.510093 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-cnibin\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-var-lib-cni-bin\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc8195c5-3667-46e7-8bca-1b80b2d9943d-host\") pod \"node-ca-g64nq\" (UID: \"fc8195c5-3667-46e7-8bca-1b80b2d9943d\") " pod="openshift-image-registry/node-ca-g64nq"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-sys-fs\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-cni-dir\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-var-lib-cni-multus\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-daemon-config\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-cnibin\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-run-ovn\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6js9q\" (UniqueName: \"kubernetes.io/projected/24428774-0c1d-4253-a9b0-384ed1b79796-kube-api-access-6js9q\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-kubernetes\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-var-lib-openvswitch\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.510330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rthhl\" (UniqueName: \"kubernetes.io/projected/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-kube-api-access-rthhl\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6-iptables-alerter-script\") pod \"iptables-alerter-4rzbn\" (UID: \"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6\") " pod="openshift-network-operator/iptables-alerter-4rzbn"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-run-netns\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-log-socket\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24428774-0c1d-4253-a9b0-384ed1b79796-ovnkube-script-lib\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-sysconfig\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-var-lib-kubelet\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-system-cni-dir\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510483 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqkpp\" (UniqueName: \"kubernetes.io/projected/7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6-kube-api-access-rqkpp\") pod \"iptables-alerter-4rzbn\" (UID: \"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6\") " pod="openshift-network-operator/iptables-alerter-4rzbn"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4e3e43f3-29b4-45df-8953-0095e22a0d55-konnectivity-ca\") pod \"konnectivity-agent-w2psw\" (UID: \"4e3e43f3-29b4-45df-8953-0095e22a0d55\") " pod="kube-system/konnectivity-agent-w2psw"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-run\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-host\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510563 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqnjb\" (UniqueName: \"kubernetes.io/projected/e076d25a-0359-40a3-8294-d82580c2252e-kube-api-access-pqnjb\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-systemd\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-cni-binary-copy\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510624 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24428774-0c1d-4253-a9b0-384ed1b79796-ovn-node-metrics-cert\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-socket-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt"
Mar 12 13:37:48.510800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4e3e43f3-29b4-45df-8953-0095e22a0d55-agent-certs\") pod \"konnectivity-agent-w2psw\" (UID: \"4e3e43f3-29b4-45df-8953-0095e22a0d55\") " pod="kube-system/konnectivity-agent-w2psw"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-run-multus-certs\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-node-log\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24428774-0c1d-4253-a9b0-384ed1b79796-env-overrides\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4a7d4073-afc5-478a-8838-a78fa193f1bd-tmp\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-etc-kubernetes\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-cni-bin\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24428774-0c1d-4253-a9b0-384ed1b79796-ovnkube-config\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-system-cni-dir\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-hostroot\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sw4m\" (UniqueName: \"kubernetes.io/projected/c07aa00c-e596-44da-b75d-f3772a7057fd-kube-api-access-7sw4m\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-etc-openvswitch\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-run-ovn-kubernetes\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.510953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-var-lib-kubelet\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-socket-dir-parent\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-conf-dir\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.511330 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bf9rp\" (UniqueName: \"kubernetes.io/projected/fc8195c5-3667-46e7-8bca-1b80b2d9943d-kube-api-access-bf9rp\") pod \"node-ca-g64nq\" (UID: \"fc8195c5-3667-46e7-8bca-1b80b2d9943d\") " pod="openshift-image-registry/node-ca-g64nq"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-registration-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511145 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8r5p\" (UniqueName: \"kubernetes.io/projected/0e6933a6-a221-45a8-ae79-110f7f192c33-kube-api-access-r8r5p\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-os-release\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c07aa00c-e596-44da-b75d-f3772a7057fd-cni-binary-copy\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c07aa00c-e596-44da-b75d-f3772a7057fd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-kubelet\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-device-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-modprobe-d\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-sys\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fc8195c5-3667-46e7-8bca-1b80b2d9943d-serviceca\") pod \"node-ca-g64nq\" (UID: \"fc8195c5-3667-46e7-8bca-1b80b2d9943d\") " pod="openshift-image-registry/node-ca-g64nq"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-sysctl-conf\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7"
Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName:
\"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-run-k8s-cni-cncf-io\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-run-netns\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511407 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-os-release\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.511896 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-slash\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-kubelet-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-sysctl-d\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-systemd-units\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-run-openvswitch\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-cni-netd\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511575 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-etc-selinux\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-lib-modules\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-run-netns\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.511801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-log-socket\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.512066 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.512280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6-iptables-alerter-script\") pod \"iptables-alerter-4rzbn\" (UID: \"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6\") " pod="openshift-network-operator/iptables-alerter-4rzbn" Mar 12 13:37:48.512399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.512362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24428774-0c1d-4253-a9b0-384ed1b79796-ovnkube-script-lib\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.512964 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.512437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-sysconfig\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.512964 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.512484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-var-lib-kubelet\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.512964 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.512526 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-system-cni-dir\") pod 
\"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.512964 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.512640 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24428774-0c1d-4253-a9b0-384ed1b79796-env-overrides\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.512964 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.512729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-cni-dir\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.512964 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.512757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-var-lib-cni-multus\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.513387 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-daemon-config\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.513387 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-cnibin\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: 
\"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.513387 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-run-ovn\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.513512 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-kubernetes\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.513512 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-var-lib-openvswitch\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.513679 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513617 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4e3e43f3-29b4-45df-8953-0095e22a0d55-konnectivity-ca\") pod \"konnectivity-agent-w2psw\" (UID: \"4e3e43f3-29b4-45df-8953-0095e22a0d55\") " pod="kube-system/konnectivity-agent-w2psw" Mar 12 13:37:48.513745 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-run\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.513745 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513720 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-run-ovn-kubernetes\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.513850 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513778 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-host\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.513850 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513778 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-cni-bin\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.513850 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.513832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-systemd\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c07aa00c-e596-44da-b75d-f3772a7057fd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " 
pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-var-lib-kubelet\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-kubelet\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24428774-0c1d-4253-a9b0-384ed1b79796-ovnkube-config\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-socket-dir-parent\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-multus-conf-dir\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " 
pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-cni-binary-copy\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514283 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-system-cni-dir\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-hostroot\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-modprobe-d\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-sys\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514590 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c07aa00c-e596-44da-b75d-f3772a7057fd-cni-binary-copy\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.514933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-os-release\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515045 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fc8195c5-3667-46e7-8bca-1b80b2d9943d-serviceca\") pod \"node-ca-g64nq\" (UID: \"fc8195c5-3667-46e7-8bca-1b80b2d9943d\") " pod="openshift-image-registry/node-ca-g64nq" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-sysctl-conf\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515118 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-run-k8s-cni-cncf-io\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.515685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-run-netns\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-systemd-units\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515222 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-os-release\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515240 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-run-openvswitch\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515262 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-slash\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-sysctl-d\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-host-cni-netd\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-var-lib-cni-bin\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515419 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-cnibin\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/fc8195c5-3667-46e7-8bca-1b80b2d9943d-host\") pod \"node-ca-g64nq\" (UID: \"fc8195c5-3667-46e7-8bca-1b80b2d9943d\") " pod="openshift-image-registry/node-ca-g64nq" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-node-log\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-host-run-multus-certs\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4a7d4073-afc5-478a-8838-a78fa193f1bd-tmp\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6-host-slash\") pod \"iptables-alerter-4rzbn\" (UID: \"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6\") " pod="openshift-network-operator/iptables-alerter-4rzbn" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-etc-openvswitch\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24428774-0c1d-4253-a9b0-384ed1b79796-run-systemd\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-etc-kubernetes\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4a7d4073-afc5-478a-8838-a78fa193f1bd-etc-tuned\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.516325 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.515770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c07aa00c-e596-44da-b75d-f3772a7057fd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.516950 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.516030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/c07aa00c-e596-44da-b75d-f3772a7057fd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.518224 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.518198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4e3e43f3-29b4-45df-8953-0095e22a0d55-agent-certs\") pod \"konnectivity-agent-w2psw\" (UID: \"4e3e43f3-29b4-45df-8953-0095e22a0d55\") " pod="kube-system/konnectivity-agent-w2psw" Mar 12 13:37:48.518407 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.518384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24428774-0c1d-4253-a9b0-384ed1b79796-ovn-node-metrics-cert\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.527421 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:48.527402 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:37:48.527421 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:48.527422 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:37:48.527581 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:48.527431 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4xbfm for pod openshift-network-diagnostics/network-check-target-mms2n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:48.527581 ip-10-0-139-20 
kubenswrapper[2575]: E0312 13:37:48.527524 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm podName:018363d6-b28d-4856-9451-fcf1632349aa nodeName:}" failed. No retries permitted until 2026-03-12 13:37:49.027484809 +0000 UTC m=+3.117008886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4xbfm" (UniqueName: "kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm") pod "network-check-target-mms2n" (UID: "018363d6-b28d-4856-9451-fcf1632349aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:48.530437 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.530390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rthhl\" (UniqueName: \"kubernetes.io/projected/43d2e0f6-060c-4389-9a1f-5bdb06198e7b-kube-api-access-rthhl\") pod \"multus-2jt74\" (UID: \"43d2e0f6-060c-4389-9a1f-5bdb06198e7b\") " pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.532816 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.532794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4djll\" (UniqueName: \"kubernetes.io/projected/4a7d4073-afc5-478a-8838-a78fa193f1bd-kube-api-access-4djll\") pod \"tuned-qjdg7\" (UID: \"4a7d4073-afc5-478a-8838-a78fa193f1bd\") " pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.532816 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.532805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sw4m\" (UniqueName: \"kubernetes.io/projected/c07aa00c-e596-44da-b75d-f3772a7057fd-kube-api-access-7sw4m\") pod \"multus-additional-cni-plugins-qq8v5\" (UID: \"c07aa00c-e596-44da-b75d-f3772a7057fd\") " pod="openshift-multus/multus-additional-cni-plugins-qq8v5" 
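The `durationBeforeRetry 500ms` in the entry above is the first step of the kubelet volume manager's doubling backoff; later entries in this same log show the progression 1s, 2s, 4s for the identical failed mounts. A minimal sketch of that retry cadence (the 2-minute cap below is an assumption for illustration, not a value taken from this log):

```python
# Sketch of the doubling durationBeforeRetry progression visible in this log:
# 500ms -> 1s -> 2s -> 4s -> ...
# The cap value is a hypothetical ceiling, not read from the log entries.
from datetime import timedelta

def backoff_delays(initial=timedelta(milliseconds=500),
                   factor=2, cap=timedelta(minutes=2), n=6):
    """Yield the first n retry delays, doubling each time up to cap."""
    delay = initial
    for _ in range(n):
        yield min(delay, cap)
        delay *= factor

print([d.total_seconds() for d in backoff_delays(n=4)])
# -> [0.5, 1.0, 2.0, 4.0]  (the same values this log records)
```

The failing mounts themselves (`kube-root-ca.crt`, `openshift-service-ca.crt`, `metrics-daemon-secret` all "not registered") are the usual early-boot symptom of the kubelet's configmap/secret managers not yet tracking those objects; the retries succeed once the API objects are registered.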
Mar 12 13:37:48.533178 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.533156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqkpp\" (UniqueName: \"kubernetes.io/projected/7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6-kube-api-access-rqkpp\") pod \"iptables-alerter-4rzbn\" (UID: \"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6\") " pod="openshift-network-operator/iptables-alerter-4rzbn" Mar 12 13:37:48.534694 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.534671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf9rp\" (UniqueName: \"kubernetes.io/projected/fc8195c5-3667-46e7-8bca-1b80b2d9943d-kube-api-access-bf9rp\") pod \"node-ca-g64nq\" (UID: \"fc8195c5-3667-46e7-8bca-1b80b2d9943d\") " pod="openshift-image-registry/node-ca-g64nq" Mar 12 13:37:48.536100 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.536077 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6js9q\" (UniqueName: \"kubernetes.io/projected/24428774-0c1d-4253-a9b0-384ed1b79796-kube-api-access-6js9q\") pod \"ovnkube-node-plcmr\" (UID: \"24428774-0c1d-4253-a9b0-384ed1b79796\") " pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.551233 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.551202 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-11 13:32:47 +0000 UTC" deadline="2027-09-04 16:36:38.008626576 +0000 UTC" Mar 12 13:37:48.551233 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.551233 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12986h58m49.457397308s" Mar 12 13:37:48.612049 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:48.612245 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-kubelet-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612245 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-etc-selinux\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612245 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-sys-fs\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612245 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqnjb\" (UniqueName: \"kubernetes.io/projected/e076d25a-0359-40a3-8294-d82580c2252e-kube-api-access-pqnjb\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:48.612245 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612142 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-socket-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612245 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-registration-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612245 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8r5p\" (UniqueName: \"kubernetes.io/projected/0e6933a6-a221-45a8-ae79-110f7f192c33-kube-api-access-r8r5p\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612245 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-device-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612598 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-device-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612598 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:48.612359 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:48.612598 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:48.612402 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs podName:e076d25a-0359-40a3-8294-d82580c2252e nodeName:}" failed. No retries permitted until 2026-03-12 13:37:49.112388761 +0000 UTC m=+3.201912823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs") pod "network-metrics-daemon-qwv64" (UID: "e076d25a-0359-40a3-8294-d82580c2252e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:48.612598 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-kubelet-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612598 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-etc-selinux\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612598 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-sys-fs\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612907 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-socket-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.612907 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.612836 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0e6933a6-a221-45a8-ae79-110f7f192c33-registration-dir\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.626798 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.626757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqnjb\" (UniqueName: \"kubernetes.io/projected/e076d25a-0359-40a3-8294-d82580c2252e-kube-api-access-pqnjb\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:48.627931 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.627907 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8r5p\" (UniqueName: \"kubernetes.io/projected/0e6933a6-a221-45a8-ae79-110f7f192c33-kube-api-access-r8r5p\") pod \"aws-ebs-csi-driver-node-257gt\" (UID: \"0e6933a6-a221-45a8-ae79-110f7f192c33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:48.693559 ip-10-0-139-20 kubenswrapper[2575]: I0312 
13:37:48.693483 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" Mar 12 13:37:48.701908 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.701879 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g64nq" Mar 12 13:37:48.702880 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:48.702848 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a7d4073_afc5_478a_8838_a78fa193f1bd.slice/crio-5b56247a2f0bad2edd7bf72cc4aa0a177570fda7f70b60fe3bea6efd158d6de7 WatchSource:0}: Error finding container 5b56247a2f0bad2edd7bf72cc4aa0a177570fda7f70b60fe3bea6efd158d6de7: Status 404 returned error can't find the container with id 5b56247a2f0bad2edd7bf72cc4aa0a177570fda7f70b60fe3bea6efd158d6de7 Mar 12 13:37:48.711018 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:48.710995 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc8195c5_3667_46e7_8bca_1b80b2d9943d.slice/crio-cfeac34c527727385ce1346e593c0e2ccb90cfeb3a1b49a2429f65f4d06f5862 WatchSource:0}: Error finding container cfeac34c527727385ce1346e593c0e2ccb90cfeb3a1b49a2429f65f4d06f5862: Status 404 returned error can't find the container with id cfeac34c527727385ce1346e593c0e2ccb90cfeb3a1b49a2429f65f4d06f5862 Mar 12 13:37:48.713108 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.713088 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2jt74" Mar 12 13:37:48.715585 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.715566 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" Mar 12 13:37:48.720688 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:48.720665 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43d2e0f6_060c_4389_9a1f_5bdb06198e7b.slice/crio-de9bf4aa4ca29d0d4034b57d7ffc4dd852d4e010bd8b447444c4823d1183516a WatchSource:0}: Error finding container de9bf4aa4ca29d0d4034b57d7ffc4dd852d4e010bd8b447444c4823d1183516a: Status 404 returned error can't find the container with id de9bf4aa4ca29d0d4034b57d7ffc4dd852d4e010bd8b447444c4823d1183516a Mar 12 13:37:48.722831 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.722794 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4rzbn" Mar 12 13:37:48.733593 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.733572 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:37:48.740203 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.740183 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w2psw" Mar 12 13:37:48.745759 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:48.745739 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" Mar 12 13:37:49.116096 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:49.116059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:49.116269 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:49.116110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:49.116269 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:49.116198 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:37:49.116269 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:49.116216 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:49.116269 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:49.116219 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:37:49.116269 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:49.116237 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4xbfm for pod openshift-network-diagnostics/network-check-target-mms2n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:49.116453 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:49.116283 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs podName:e076d25a-0359-40a3-8294-d82580c2252e nodeName:}" failed. No retries permitted until 2026-03-12 13:37:50.116262522 +0000 UTC m=+4.205786589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs") pod "network-metrics-daemon-qwv64" (UID: "e076d25a-0359-40a3-8294-d82580c2252e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:49.116453 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:49.116301 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm podName:018363d6-b28d-4856-9451-fcf1632349aa nodeName:}" failed. No retries permitted until 2026-03-12 13:37:50.116293858 +0000 UTC m=+4.205817925 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4xbfm" (UniqueName: "kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm") pod "network-check-target-mms2n" (UID: "018363d6-b28d-4856-9451-fcf1632349aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:49.480300 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:49.480197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jt74" event={"ID":"43d2e0f6-060c-4389-9a1f-5bdb06198e7b","Type":"ContainerStarted","Data":"de9bf4aa4ca29d0d4034b57d7ffc4dd852d4e010bd8b447444c4823d1183516a"} Mar 12 13:37:49.481198 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:49.481158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g64nq" event={"ID":"fc8195c5-3667-46e7-8bca-1b80b2d9943d","Type":"ContainerStarted","Data":"cfeac34c527727385ce1346e593c0e2ccb90cfeb3a1b49a2429f65f4d06f5862"} Mar 12 13:37:49.482181 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:49.482156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" event={"ID":"4a7d4073-afc5-478a-8838-a78fa193f1bd","Type":"ContainerStarted","Data":"5b56247a2f0bad2edd7bf72cc4aa0a177570fda7f70b60fe3bea6efd158d6de7"} Mar 12 13:37:49.551814 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:49.551770 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-11 13:32:47 +0000 UTC" deadline="2027-08-20 02:19:36.221300717 +0000 UTC" Mar 12 13:37:49.551814 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:49.551805 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12612h41m46.669500665s" Mar 12 13:37:49.554977 ip-10-0-139-20 kubenswrapper[2575]: W0312 
13:37:49.554944 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc07aa00c_e596_44da_b75d_f3772a7057fd.slice/crio-ac3328e5bbe81fd5010a01944b3f93f613ef2b067b45d22487a1b53b68ce0914 WatchSource:0}: Error finding container ac3328e5bbe81fd5010a01944b3f93f613ef2b067b45d22487a1b53b68ce0914: Status 404 returned error can't find the container with id ac3328e5bbe81fd5010a01944b3f93f613ef2b067b45d22487a1b53b68ce0914 Mar 12 13:37:49.611704 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:49.611643 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e3e43f3_29b4_45df_8953_0095e22a0d55.slice/crio-a4d0d1e90a8a4c52224bb7a523887af2b1d10e1f83bc28841f80152fac5f206b WatchSource:0}: Error finding container a4d0d1e90a8a4c52224bb7a523887af2b1d10e1f83bc28841f80152fac5f206b: Status 404 returned error can't find the container with id a4d0d1e90a8a4c52224bb7a523887af2b1d10e1f83bc28841f80152fac5f206b Mar 12 13:37:49.613120 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:49.613094 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24428774_0c1d_4253_a9b0_384ed1b79796.slice/crio-6794c8bf60b6a1fc2adc00bfd1ae606396c717eb2221d58a35df83095b2ebcd1 WatchSource:0}: Error finding container 6794c8bf60b6a1fc2adc00bfd1ae606396c717eb2221d58a35df83095b2ebcd1: Status 404 returned error can't find the container with id 6794c8bf60b6a1fc2adc00bfd1ae606396c717eb2221d58a35df83095b2ebcd1 Mar 12 13:37:49.613841 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:49.613809 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e6933a6_a221_45a8_ae79_110f7f192c33.slice/crio-a90af952c94d78d7163d66199d9377b705944acbcdfe1e267f9f669faf09631b WatchSource:0}: Error finding container 
a90af952c94d78d7163d66199d9377b705944acbcdfe1e267f9f669faf09631b: Status 404 returned error can't find the container with id a90af952c94d78d7163d66199d9377b705944acbcdfe1e267f9f669faf09631b Mar 12 13:37:49.614999 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:37:49.614976 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7812c8ef_b633_4d0e_bdb1_683d0a4b9dd6.slice/crio-84d62918ce34a05be70569c804eddb770830bfec310d1db9a5fa8023211554ff WatchSource:0}: Error finding container 84d62918ce34a05be70569c804eddb770830bfec310d1db9a5fa8023211554ff: Status 404 returned error can't find the container with id 84d62918ce34a05be70569c804eddb770830bfec310d1db9a5fa8023211554ff Mar 12 13:37:50.123054 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.122824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:50.123255 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.123082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:50.123255 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:50.123123 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:37:50.123255 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:50.123152 2575 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:37:50.123255 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:50.123167 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4xbfm for pod openshift-network-diagnostics/network-check-target-mms2n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:50.123255 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:50.123219 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:50.123255 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:50.123227 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm podName:018363d6-b28d-4856-9451-fcf1632349aa nodeName:}" failed. No retries permitted until 2026-03-12 13:37:52.123208382 +0000 UTC m=+6.212732456 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4xbfm" (UniqueName: "kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm") pod "network-check-target-mms2n" (UID: "018363d6-b28d-4856-9451-fcf1632349aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:50.123563 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:50.123264 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs podName:e076d25a-0359-40a3-8294-d82580c2252e nodeName:}" failed. No retries permitted until 2026-03-12 13:37:52.123252204 +0000 UTC m=+6.212776270 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs") pod "network-metrics-daemon-qwv64" (UID: "e076d25a-0359-40a3-8294-d82580c2252e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:50.474346 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.473623 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:50.474346 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:50.473765 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:37:50.474346 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.474173 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:50.474346 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:50.474271 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:37:50.490062 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.490012 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4rzbn" event={"ID":"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6","Type":"ContainerStarted","Data":"84d62918ce34a05be70569c804eddb770830bfec310d1db9a5fa8023211554ff"} Mar 12 13:37:50.493148 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.493108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" event={"ID":"24428774-0c1d-4253-a9b0-384ed1b79796","Type":"ContainerStarted","Data":"6794c8bf60b6a1fc2adc00bfd1ae606396c717eb2221d58a35df83095b2ebcd1"} Mar 12 13:37:50.498402 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.498376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" event={"ID":"c07aa00c-e596-44da-b75d-f3772a7057fd","Type":"ContainerStarted","Data":"ac3328e5bbe81fd5010a01944b3f93f613ef2b067b45d22487a1b53b68ce0914"} Mar 12 13:37:50.502592 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.502007 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal" event={"ID":"31268ff111c3b6a3e6157c28dc36e73e","Type":"ContainerStarted","Data":"cc928c6debff38bacf193e12f85b9f701a51fb81f406decabd54e0eaaf521a19"} Mar 12 13:37:50.508525 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.508477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" event={"ID":"0e6933a6-a221-45a8-ae79-110f7f192c33","Type":"ContainerStarted","Data":"a90af952c94d78d7163d66199d9377b705944acbcdfe1e267f9f669faf09631b"} Mar 12 13:37:50.511120 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:50.511080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-w2psw" event={"ID":"4e3e43f3-29b4-45df-8953-0095e22a0d55","Type":"ContainerStarted","Data":"a4d0d1e90a8a4c52224bb7a523887af2b1d10e1f83bc28841f80152fac5f206b"} Mar 12 13:37:51.522220 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:51.522170 2575 generic.go:358] "Generic (PLEG): container finished" podID="fc0a321c7bdfae40265690c7f19683de" containerID="e723f67e3863247b9e2c860c914b16dc0de35d371ec928cfe39b59b10a4ddbab" exitCode=0 Mar 12 13:37:51.522739 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:51.522709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal" event={"ID":"fc0a321c7bdfae40265690c7f19683de","Type":"ContainerDied","Data":"e723f67e3863247b9e2c860c914b16dc0de35d371ec928cfe39b59b10a4ddbab"} Mar 12 13:37:51.540166 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:51.540109 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-20.ec2.internal" podStartSLOduration=4.540092299 podStartE2EDuration="4.540092299s" podCreationTimestamp="2026-03-12 13:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:37:50.520522777 +0000 UTC m=+4.610046859" watchObservedRunningTime="2026-03-12 13:37:51.540092299 +0000 UTC m=+5.629616386" Mar 12 13:37:52.141579 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:52.141540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:52.141769 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:52.141600 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:52.141829 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:52.141775 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:52.141880 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:52.141836 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs podName:e076d25a-0359-40a3-8294-d82580c2252e nodeName:}" failed. No retries permitted until 2026-03-12 13:37:56.14181787 +0000 UTC m=+10.231341947 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs") pod "network-metrics-daemon-qwv64" (UID: "e076d25a-0359-40a3-8294-d82580c2252e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:52.142252 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:52.142229 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:37:52.142252 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:52.142254 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:37:52.142392 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:52.142268 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4xbfm for pod openshift-network-diagnostics/network-check-target-mms2n: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:52.142392 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:52.142312 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm podName:018363d6-b28d-4856-9451-fcf1632349aa nodeName:}" failed. No retries permitted until 2026-03-12 13:37:56.142296183 +0000 UTC m=+10.231820259 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4xbfm" (UniqueName: "kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm") pod "network-check-target-mms2n" (UID: "018363d6-b28d-4856-9451-fcf1632349aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:52.471889 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:52.471815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:52.472053 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:52.471940 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:37:52.472282 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:52.472261 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:52.472395 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:52.472377 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:37:54.472812 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:54.472232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:54.472812 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:54.472357 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:37:54.472812 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:54.472686 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:54.472812 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:54.472773 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:37:56.177302 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:56.177262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:56.177675 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:56.177328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:56.177675 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:56.177508 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:56.177675 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:56.177578 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs podName:e076d25a-0359-40a3-8294-d82580c2252e nodeName:}" failed. No retries permitted until 2026-03-12 13:38:04.177558032 +0000 UTC m=+18.267082116 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs") pod "network-metrics-daemon-qwv64" (UID: "e076d25a-0359-40a3-8294-d82580c2252e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:56.177934 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:56.177915 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:37:56.178006 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:56.177949 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:37:56.178006 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:56.177966 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4xbfm for pod openshift-network-diagnostics/network-check-target-mms2n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:56.178104 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:56.178028 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm podName:018363d6-b28d-4856-9451-fcf1632349aa nodeName:}" failed. No retries permitted until 2026-03-12 13:38:04.178002468 +0000 UTC m=+18.267526543 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4xbfm" (UniqueName: "kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm") pod "network-check-target-mms2n" (UID: "018363d6-b28d-4856-9451-fcf1632349aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:56.472035 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:56.471960 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:56.472196 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:56.472066 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:37:56.472435 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:56.472417 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:56.472686 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:56.472664 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:37:58.471627 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:58.471589 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:37:58.472033 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:37:58.471604 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:37:58.472033 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:58.471757 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:37:58.472118 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:37:58.472101 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:38:00.471759 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:00.471725 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:00.471759 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:00.471743 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:00.472266 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:00.471847 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:38:00.472266 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:00.472007 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:38:02.471377 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:02.471335 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:02.471767 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:02.471389 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:02.471767 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:02.471492 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:38:02.471767 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:02.471600 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:38:04.232065 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:04.232030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:04.232491 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:04.232092 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:04.232491 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:04.232162 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:38:04.232491 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:04.232198 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:38:04.232491 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:04.232214 2575 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:38:04.232491 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:04.232227 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4xbfm for pod openshift-network-diagnostics/network-check-target-mms2n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:38:04.232491 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:04.232245 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs podName:e076d25a-0359-40a3-8294-d82580c2252e nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.232221799 +0000 UTC m=+34.321745867 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs") pod "network-metrics-daemon-qwv64" (UID: "e076d25a-0359-40a3-8294-d82580c2252e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:38:04.232491 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:04.232266 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm podName:018363d6-b28d-4856-9451-fcf1632349aa nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.232255017 +0000 UTC m=+34.321779084 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4xbfm" (UniqueName: "kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm") pod "network-check-target-mms2n" (UID: "018363d6-b28d-4856-9451-fcf1632349aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:38:04.471555 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:04.471515 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:04.471745 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:04.471695 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:38:04.471837 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:04.471755 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:04.471901 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:04.471884 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:38:06.474973 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:06.474943 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:06.475296 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:06.475067 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa" Mar 12 13:38:06.475468 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:06.475445 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:06.475574 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:06.475552 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e" Mar 12 13:38:07.551760 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.551329 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal" event={"ID":"fc0a321c7bdfae40265690c7f19683de","Type":"ContainerStarted","Data":"9aee8b13fdfc63d3d2a2a662b6a6ced0ff53fe20e488b570217ece3a700f7247"} Mar 12 13:38:07.553049 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.552951 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g64nq" event={"ID":"fc8195c5-3667-46e7-8bca-1b80b2d9943d","Type":"ContainerStarted","Data":"e2f5586d975edfc26b595dc1fa10e1687690124bc91824265f52db088d6259e0"} Mar 12 13:38:07.556512 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.556480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" event={"ID":"24428774-0c1d-4253-a9b0-384ed1b79796","Type":"ContainerStarted","Data":"1516db30a6e6c482be2a5f175f9039f50cfde5978b09844b66d6c7350c649e0d"} Mar 12 13:38:07.556512 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.556510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" event={"ID":"24428774-0c1d-4253-a9b0-384ed1b79796","Type":"ContainerStarted","Data":"057b7f125142f0ac2ba5fa613edefa295e3a6bbea721a7d0b22b2d3d79ecbde1"} Mar 12 13:38:07.556694 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.556524 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" event={"ID":"24428774-0c1d-4253-a9b0-384ed1b79796","Type":"ContainerStarted","Data":"3bc36e8dca0135aab4973bf1277fd7d06d77b4aa8f7863f67d6254181f6040a0"} Mar 12 13:38:07.556694 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.556534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" event={"ID":"24428774-0c1d-4253-a9b0-384ed1b79796","Type":"ContainerStarted","Data":"30038b429674e71fb2ec005b660b2d4b7ea50d033d0944dab6273f692ba9a4ad"} Mar 12 13:38:07.556694 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.556541 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" event={"ID":"24428774-0c1d-4253-a9b0-384ed1b79796","Type":"ContainerStarted","Data":"18978d0b4cf4af1ba6320180c55f23b45fdcd788cbfc25418a5eb05c6821ccc0"} Mar 12 13:38:07.556694 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.556550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" event={"ID":"24428774-0c1d-4253-a9b0-384ed1b79796","Type":"ContainerStarted","Data":"860dab6d40c530f8caf1babf8d84fbc08e9c9848ca137770bc099ad574bf10f9"} Mar 12 13:38:07.557792 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.557764 2575 generic.go:358] "Generic (PLEG): container finished" podID="c07aa00c-e596-44da-b75d-f3772a7057fd" containerID="6995efdefad1cd0706fbde28937560affddacf91a7119448bb8dcc42c5589b0a" exitCode=0 Mar 12 13:38:07.557881 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.557843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" event={"ID":"c07aa00c-e596-44da-b75d-f3772a7057fd","Type":"ContainerDied","Data":"6995efdefad1cd0706fbde28937560affddacf91a7119448bb8dcc42c5589b0a"} Mar 12 13:38:07.559602 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.559548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jt74" event={"ID":"43d2e0f6-060c-4389-9a1f-5bdb06198e7b","Type":"ContainerStarted","Data":"f573a3b48b7ba7e0ec12acb6ab8278ae04066b44a57a29402d6e04eb105619d2"} Mar 12 13:38:07.561810 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.561786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" event={"ID":"4a7d4073-afc5-478a-8838-a78fa193f1bd","Type":"ContainerStarted","Data":"9c3e2dac2ec761372cb0aca0554b2a472c95f35bd80fda27b6d82e7effe27d67"} Mar 12 13:38:07.562983 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.562961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" event={"ID":"0e6933a6-a221-45a8-ae79-110f7f192c33","Type":"ContainerStarted","Data":"9a5a84dd262781ba27d6a877f818c74f8f14e3f24f07ba81d79b8e27c859097d"} Mar 12 13:38:07.564543 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.564517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w2psw" event={"ID":"4e3e43f3-29b4-45df-8953-0095e22a0d55","Type":"ContainerStarted","Data":"9f5b23a9c8973c96effd1721203cb5c4033cc1d913034e61a3980526c12d1c26"} Mar 12 13:38:07.569053 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.569008 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-20.ec2.internal" podStartSLOduration=20.568996122 podStartE2EDuration="20.568996122s" podCreationTimestamp="2026-03-12 13:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:38:07.568352814 +0000 UTC m=+21.657876913" watchObservedRunningTime="2026-03-12 13:38:07.568996122 +0000 UTC m=+21.658520205" Mar 12 13:38:07.587216 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.587165 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2jt74" podStartSLOduration=3.879372874 podStartE2EDuration="21.587151452s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:37:48.722521002 +0000 UTC m=+2.812045067" lastFinishedPulling="2026-03-12 13:38:06.43029958 +0000 UTC m=+20.519823645" 
observedRunningTime="2026-03-12 13:38:07.586622952 +0000 UTC m=+21.676147036" watchObservedRunningTime="2026-03-12 13:38:07.587151452 +0000 UTC m=+21.676675535" Mar 12 13:38:07.643584 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.643537 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g64nq" podStartSLOduration=8.629208029 podStartE2EDuration="21.643520914s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:37:48.712734619 +0000 UTC m=+2.802258688" lastFinishedPulling="2026-03-12 13:38:01.727047503 +0000 UTC m=+15.816571573" observedRunningTime="2026-03-12 13:38:07.643377037 +0000 UTC m=+21.732901124" watchObservedRunningTime="2026-03-12 13:38:07.643520914 +0000 UTC m=+21.733044988" Mar 12 13:38:07.644165 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.644136 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qjdg7" podStartSLOduration=3.936827945 podStartE2EDuration="21.644127815s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:37:48.704417161 +0000 UTC m=+2.793941226" lastFinishedPulling="2026-03-12 13:38:06.411717018 +0000 UTC m=+20.501241096" observedRunningTime="2026-03-12 13:38:07.623586028 +0000 UTC m=+21.713110113" watchObservedRunningTime="2026-03-12 13:38:07.644127815 +0000 UTC m=+21.733651899" Mar 12 13:38:07.659178 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.658289 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-w2psw" podStartSLOduration=4.862463107 podStartE2EDuration="21.658274759s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:37:49.61366927 +0000 UTC m=+3.703193346" lastFinishedPulling="2026-03-12 13:38:06.409480923 +0000 UTC m=+20.499004998" observedRunningTime="2026-03-12 13:38:07.658130365 +0000 UTC m=+21.747654450" 
watchObservedRunningTime="2026-03-12 13:38:07.658274759 +0000 UTC m=+21.747798843" Mar 12 13:38:07.946014 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:07.945989 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 12 13:38:08.463116 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:08.463019 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-12T13:38:07.946007771Z","UUID":"853912e7-d657-49fd-bab8-512048d35d0b","Handler":null,"Name":"","Endpoint":""} Mar 12 13:38:08.465559 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:08.465537 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 12 13:38:08.465559 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:08.465564 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 12 13:38:08.470690 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:08.470672 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:08.470825 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:08.470805 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa"
Mar 12 13:38:08.470920 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:08.470886 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64"
Mar 12 13:38:08.471031 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:08.471012 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e"
Mar 12 13:38:08.567069 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:08.567034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4rzbn" event={"ID":"7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6","Type":"ContainerStarted","Data":"19ef3e3199b4c908e10139e66b67a6ad0a4779103cee3977868a02fe5fa41478"}
Mar 12 13:38:08.570256 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:08.570227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" event={"ID":"0e6933a6-a221-45a8-ae79-110f7f192c33","Type":"ContainerStarted","Data":"3d4e75c042323764ed953b0728740d3fb00927df1f16e9b1fcce0eba8217fe31"}
Mar 12 13:38:08.587253 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:08.587206 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4rzbn" podStartSLOduration=5.809521995 podStartE2EDuration="22.587189718s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:37:49.617888303 +0000 UTC m=+3.707412365" lastFinishedPulling="2026-03-12 13:38:06.39555601 +0000 UTC m=+20.485080088" observedRunningTime="2026-03-12 13:38:08.586928112 +0000 UTC m=+22.676452196" watchObservedRunningTime="2026-03-12 13:38:08.587189718 +0000 UTC m=+22.676713806"
Mar 12 13:38:09.573181 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:09.573146 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" event={"ID":"0e6933a6-a221-45a8-ae79-110f7f192c33","Type":"ContainerStarted","Data":"9cbc45074811bb4e330cf3f572b4b9f27128a036d2562348fff42783ea2605d7"}
Mar 12 13:38:09.575887 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:09.575859 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" event={"ID":"24428774-0c1d-4253-a9b0-384ed1b79796","Type":"ContainerStarted","Data":"b445da08016ad2446ff1291ffccf2579ffe799a12e921d0763e7c66749118557"}
Mar 12 13:38:09.594586 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:09.594515 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-257gt" podStartSLOduration=4.131094396 podStartE2EDuration="23.594500309s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:37:49.615878391 +0000 UTC m=+3.705402467" lastFinishedPulling="2026-03-12 13:38:09.079284309 +0000 UTC m=+23.168808380" observedRunningTime="2026-03-12 13:38:09.5931937 +0000 UTC m=+23.682717783" watchObservedRunningTime="2026-03-12 13:38:09.594500309 +0000 UTC m=+23.684024393"
Mar 12 13:38:09.636664 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:09.636626 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-w2psw"
Mar 12 13:38:09.637539 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:09.637514 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-w2psw"
Mar 12 13:38:10.471918 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:10.471690 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n"
Mar 12 13:38:10.472090 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:10.471752 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64"
Mar 12 13:38:10.472090 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:10.471985 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa"
Mar 12 13:38:10.472090 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:10.472045 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e"
Mar 12 13:38:10.577477 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:10.577447 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-w2psw"
Mar 12 13:38:10.577854 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:10.577713 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-w2psw"
Mar 12 13:38:12.471564 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:12.471531 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n"
Mar 12 13:38:12.472034 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:12.471672 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa"
Mar 12 13:38:12.472034 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:12.471723 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64"
Mar 12 13:38:12.472034 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:12.471836 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e"
Mar 12 13:38:13.586990 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:13.586735 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" event={"ID":"24428774-0c1d-4253-a9b0-384ed1b79796","Type":"ContainerStarted","Data":"8050a54f895d5d39365fdcc69b436dbc54fe0688badd65c521f213f4c41d114e"}
Mar 12 13:38:13.587769 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:13.587095 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:38:13.587769 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:13.587124 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:38:13.588409 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:13.588382 2575 generic.go:358] "Generic (PLEG): container finished" podID="c07aa00c-e596-44da-b75d-f3772a7057fd" containerID="3e2ff25e36701440fa3bcc6259d277dd6dd85a3a183dcd5540d0e512bef4d9a6" exitCode=0
Mar 12 13:38:13.588492 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:13.588447 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" event={"ID":"c07aa00c-e596-44da-b75d-f3772a7057fd","Type":"ContainerDied","Data":"3e2ff25e36701440fa3bcc6259d277dd6dd85a3a183dcd5540d0e512bef4d9a6"}
Mar 12 13:38:13.601712 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:13.601694 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:38:13.650539 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:13.650496 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" podStartSLOduration=10.478779403 podStartE2EDuration="27.65048418s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:37:49.615144461 +0000 UTC m=+3.704668526" lastFinishedPulling="2026-03-12 13:38:06.786849235 +0000 UTC m=+20.876373303" observedRunningTime="2026-03-12 13:38:13.648840375 +0000 UTC m=+27.738364460" watchObservedRunningTime="2026-03-12 13:38:13.65048418 +0000 UTC m=+27.740008264"
Mar 12 13:38:14.471275 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:14.471245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n"
Mar 12 13:38:14.471275 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:14.471265 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64"
Mar 12 13:38:14.471444 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:14.471339 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa"
Mar 12 13:38:14.471444 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:14.471399 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e"
Mar 12 13:38:14.590893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:14.590864 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:38:14.605420 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:14.605396 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr"
Mar 12 13:38:14.967717 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:14.967680 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mms2n"]
Mar 12 13:38:14.967888 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:14.967800 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n"
Mar 12 13:38:14.967968 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:14.967895 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa"
Mar 12 13:38:14.968337 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:14.968311 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qwv64"]
Mar 12 13:38:14.968465 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:14.968438 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64"
Mar 12 13:38:14.968545 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:14.968525 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e"
Mar 12 13:38:15.593966 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:15.593936 2575 generic.go:358] "Generic (PLEG): container finished" podID="c07aa00c-e596-44da-b75d-f3772a7057fd" containerID="12d242dda6af0c2f61ade4b68cbaeaff9c96dc3479c838cf81ec0fe1d8ee672d" exitCode=0
Mar 12 13:38:15.594310 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:15.594017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" event={"ID":"c07aa00c-e596-44da-b75d-f3772a7057fd","Type":"ContainerDied","Data":"12d242dda6af0c2f61ade4b68cbaeaff9c96dc3479c838cf81ec0fe1d8ee672d"}
Mar 12 13:38:16.472245 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:16.472226 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n"
Mar 12 13:38:16.472404 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:16.472302 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa"
Mar 12 13:38:16.472404 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:16.472340 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64"
Mar 12 13:38:16.472508 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:16.472403 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e"
Mar 12 13:38:16.601717 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:16.601545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" event={"ID":"c07aa00c-e596-44da-b75d-f3772a7057fd","Type":"ContainerStarted","Data":"f44b58538591e61bf1b55aa64804f335b3963d765b502c0379339c64d9f41004"}
Mar 12 13:38:17.605150 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:17.605117 2575 generic.go:358] "Generic (PLEG): container finished" podID="c07aa00c-e596-44da-b75d-f3772a7057fd" containerID="f44b58538591e61bf1b55aa64804f335b3963d765b502c0379339c64d9f41004" exitCode=0
Mar 12 13:38:17.605504 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:17.605168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" event={"ID":"c07aa00c-e596-44da-b75d-f3772a7057fd","Type":"ContainerDied","Data":"f44b58538591e61bf1b55aa64804f335b3963d765b502c0379339c64d9f41004"}
Mar 12 13:38:18.471085 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:18.471049 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64"
Mar 12 13:38:18.471376 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:18.471058 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n"
Mar 12 13:38:18.471376 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:18.471189 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwv64" podUID="e076d25a-0359-40a3-8294-d82580c2252e"
Mar 12 13:38:18.471376 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:18.471283 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mms2n" podUID="018363d6-b28d-4856-9451-fcf1632349aa"
Mar 12 13:38:19.226905 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.226876 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-20.ec2.internal" event="NodeReady"
Mar 12 13:38:19.227323 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.227019 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 12 13:38:19.267066 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.267033 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b465c86c9-nbbq2"]
Mar 12 13:38:19.273540 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.273512 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.277111 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.277090 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Mar 12 13:38:19.277547 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.277523 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Mar 12 13:38:19.277673 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.277576 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9d7nr\""
Mar 12 13:38:19.277971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.277953 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Mar 12 13:38:19.284049 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.283895 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Mar 12 13:38:19.284369 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.284351 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jjsfd"]
Mar 12 13:38:19.287685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.287666 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4jz8n"]
Mar 12 13:38:19.287863 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.287750 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.291274 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.291255 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4jz8n"
Mar 12 13:38:19.292242 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.292216 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Mar 12 13:38:19.292344 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.292287 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.292401 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.292367 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4fw8j\""
Mar 12 13:38:19.292545 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.292528 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.292772 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.292754 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Mar 12 13:38:19.294139 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.293701 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Mar 12 13:38:19.294139 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.294132 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.294491 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.294475 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6llcj\""
Mar 12 13:38:19.294906 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.294888 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b465c86c9-nbbq2"]
Mar 12 13:38:19.296250 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.295828 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.303019 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.302997 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4jz8n"]
Mar 12 13:38:19.318559 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.318473 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jjsfd"]
Mar 12 13:38:19.345693 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.345641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2278add-3ad4-46db-9278-d8a8cab50031-ca-trust-extracted\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.345855 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.345706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-trusted-ca\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.345855 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.345732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-image-registry-private-configuration\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.345855 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.345762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.345855 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.345829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgkr\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-kube-api-access-gdgkr\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.346065 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.345864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-registry-certificates\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.346065 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.345889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-bound-sa-token\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.346065 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.345914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-installation-pull-secrets\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.447228 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2278add-3ad4-46db-9278-d8a8cab50031-ca-trust-extracted\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.447396 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-trusted-ca\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.447396 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59289eca-3781-496b-9498-b1ba7c5d593e-config-volume\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.447396 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-image-registry-private-configuration\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.447396 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n"
Mar 12 13:38:19.447396 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkndf\" (UniqueName: \"kubernetes.io/projected/59289eca-3781-496b-9498-b1ba7c5d593e-kube-api-access-kkndf\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.447396 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/59289eca-3781-496b-9498-b1ba7c5d593e-tmp-dir\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdgkr\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-kube-api-access-gdgkr\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:19.447465 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:19.447485 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b465c86c9-nbbq2: secret "image-registry-tls" not found
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-registry-certificates\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-bound-sa-token\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:19.447543 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls podName:f2278add-3ad4-46db-9278-d8a8cab50031 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:19.947527624 +0000 UTC m=+34.037051687 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls") pod "image-registry-6b465c86c9-nbbq2" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031") : secret "image-registry-tls" not found
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-installation-pull-secrets\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447597 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lhz4\" (UniqueName: \"kubernetes.io/projected/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-kube-api-access-9lhz4\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n"
Mar 12 13:38:19.447741 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.447627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2278add-3ad4-46db-9278-d8a8cab50031-ca-trust-extracted\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.448344 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.448323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-registry-certificates\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.448433 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.448423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-trusted-ca\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.452074 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.452050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-installation-pull-secrets\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.452183 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.452050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-image-registry-private-configuration\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.460000 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.459956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgkr\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-kube-api-access-gdgkr\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.460132 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.460107 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-bound-sa-token\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:19.508383 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.508357 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-blv4t"]
Mar 12 13:38:19.513483 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.513469 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-blv4t"
Mar 12 13:38:19.516373 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.516356 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lmr5h\""
Mar 12 13:38:19.548895 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.548865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59289eca-3781-496b-9498-b1ba7c5d593e-config-volume\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.548998 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.548899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n"
Mar 12 13:38:19.548998 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.548920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkndf\" (UniqueName: \"kubernetes.io/projected/59289eca-3781-496b-9498-b1ba7c5d593e-kube-api-access-kkndf\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.548998 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.548978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/59289eca-3781-496b-9498-b1ba7c5d593e-tmp-dir\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.549098 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.549013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.549098 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.549045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lhz4\" (UniqueName: \"kubernetes.io/projected/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-kube-api-access-9lhz4\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n"
Mar 12 13:38:19.549180 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:19.549158 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 12 13:38:19.549180 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:19.549168 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 12 13:38:19.549271 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:19.549220 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert podName:3eb6f831-4019-43bf-9cec-d541e8e0f1dc nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.049203233 +0000 UTC m=+34.138727300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert") pod "ingress-canary-4jz8n" (UID: "3eb6f831-4019-43bf-9cec-d541e8e0f1dc") : secret "canary-serving-cert" not found
Mar 12 13:38:19.549271 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:19.549238 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls podName:59289eca-3781-496b-9498-b1ba7c5d593e nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.049229842 +0000 UTC m=+34.138753904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls") pod "dns-default-jjsfd" (UID: "59289eca-3781-496b-9498-b1ba7c5d593e") : secret "dns-default-metrics-tls" not found
Mar 12 13:38:19.549386 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.549335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/59289eca-3781-496b-9498-b1ba7c5d593e-tmp-dir\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.549719 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.549692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59289eca-3781-496b-9498-b1ba7c5d593e-config-volume\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:19.560960 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.560933 2575
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkndf\" (UniqueName: \"kubernetes.io/projected/59289eca-3781-496b-9498-b1ba7c5d593e-kube-api-access-kkndf\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd" Mar 12 13:38:19.572087 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.572037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lhz4\" (UniqueName: \"kubernetes.io/projected/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-kube-api-access-9lhz4\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n" Mar 12 13:38:19.650518 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.650479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ab59b1c8-0dc2-45d5-aea6-a91ec018f894-hosts-file\") pod \"node-resolver-blv4t\" (UID: \"ab59b1c8-0dc2-45d5-aea6-a91ec018f894\") " pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.650702 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.650527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab59b1c8-0dc2-45d5-aea6-a91ec018f894-tmp-dir\") pod \"node-resolver-blv4t\" (UID: \"ab59b1c8-0dc2-45d5-aea6-a91ec018f894\") " pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.650702 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.650544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnkf\" (UniqueName: \"kubernetes.io/projected/ab59b1c8-0dc2-45d5-aea6-a91ec018f894-kube-api-access-8nnkf\") pod \"node-resolver-blv4t\" (UID: \"ab59b1c8-0dc2-45d5-aea6-a91ec018f894\") " pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.751244 ip-10-0-139-20 
kubenswrapper[2575]: I0312 13:38:19.751209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ab59b1c8-0dc2-45d5-aea6-a91ec018f894-hosts-file\") pod \"node-resolver-blv4t\" (UID: \"ab59b1c8-0dc2-45d5-aea6-a91ec018f894\") " pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.751244 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.751243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab59b1c8-0dc2-45d5-aea6-a91ec018f894-tmp-dir\") pod \"node-resolver-blv4t\" (UID: \"ab59b1c8-0dc2-45d5-aea6-a91ec018f894\") " pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.751462 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.751328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ab59b1c8-0dc2-45d5-aea6-a91ec018f894-hosts-file\") pod \"node-resolver-blv4t\" (UID: \"ab59b1c8-0dc2-45d5-aea6-a91ec018f894\") " pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.751462 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.751351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nnkf\" (UniqueName: \"kubernetes.io/projected/ab59b1c8-0dc2-45d5-aea6-a91ec018f894-kube-api-access-8nnkf\") pod \"node-resolver-blv4t\" (UID: \"ab59b1c8-0dc2-45d5-aea6-a91ec018f894\") " pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.751565 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.751495 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab59b1c8-0dc2-45d5-aea6-a91ec018f894-tmp-dir\") pod \"node-resolver-blv4t\" (UID: \"ab59b1c8-0dc2-45d5-aea6-a91ec018f894\") " pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.769912 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.769881 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8nnkf\" (UniqueName: \"kubernetes.io/projected/ab59b1c8-0dc2-45d5-aea6-a91ec018f894-kube-api-access-8nnkf\") pod \"node-resolver-blv4t\" (UID: \"ab59b1c8-0dc2-45d5-aea6-a91ec018f894\") " pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.822027 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.821996 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-blv4t" Mar 12 13:38:19.830426 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:19.830385 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab59b1c8_0dc2_45d5_aea6_a91ec018f894.slice/crio-2e61498a9c97b262c1978f88270df9d7ea4fd8ef57f935bd478a5b9dd398f762 WatchSource:0}: Error finding container 2e61498a9c97b262c1978f88270df9d7ea4fd8ef57f935bd478a5b9dd398f762: Status 404 returned error can't find the container with id 2e61498a9c97b262c1978f88270df9d7ea4fd8ef57f935bd478a5b9dd398f762 Mar 12 13:38:19.952488 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:19.952460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" Mar 12 13:38:19.952584 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:19.952573 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 12 13:38:19.952633 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:19.952588 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b465c86c9-nbbq2: secret "image-registry-tls" not found Mar 12 13:38:19.952707 ip-10-0-139-20 kubenswrapper[2575]: E0312 
13:38:19.952635 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls podName:f2278add-3ad4-46db-9278-d8a8cab50031 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.952619879 +0000 UTC m=+35.042143947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls") pod "image-registry-6b465c86c9-nbbq2" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031") : secret "image-registry-tls" not found Mar 12 13:38:20.053106 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.053077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd" Mar 12 13:38:20.053261 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.053164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n" Mar 12 13:38:20.053309 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.053254 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 12 13:38:20.053309 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.053307 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 12 13:38:20.053379 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.053333 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls 
podName:59289eca-3781-496b-9498-b1ba7c5d593e nodeName:}" failed. No retries permitted until 2026-03-12 13:38:21.053314246 +0000 UTC m=+35.142838311 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls") pod "dns-default-jjsfd" (UID: "59289eca-3781-496b-9498-b1ba7c5d593e") : secret "dns-default-metrics-tls" not found Mar 12 13:38:20.053379 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.053350 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert podName:3eb6f831-4019-43bf-9cec-d541e8e0f1dc nodeName:}" failed. No retries permitted until 2026-03-12 13:38:21.053341829 +0000 UTC m=+35.142865890 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert") pod "ingress-canary-4jz8n" (UID: "3eb6f831-4019-43bf-9cec-d541e8e0f1dc") : secret "canary-serving-cert" not found Mar 12 13:38:20.256443 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.256402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:20.257332 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.256454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:20.257332 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.256586 2575 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:38:20.257332 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.256604 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:38:20.257332 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.256617 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4xbfm for pod openshift-network-diagnostics/network-check-target-mms2n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:38:20.257332 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.256585 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:38:20.257332 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.256687 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm podName:018363d6-b28d-4856-9451-fcf1632349aa nodeName:}" failed. No retries permitted until 2026-03-12 13:38:52.256667289 +0000 UTC m=+66.346191378 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4xbfm" (UniqueName: "kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm") pod "network-check-target-mms2n" (UID: "018363d6-b28d-4856-9451-fcf1632349aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:38:20.257332 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.256705 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs podName:e076d25a-0359-40a3-8294-d82580c2252e nodeName:}" failed. No retries permitted until 2026-03-12 13:38:52.256694745 +0000 UTC m=+66.346218808 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs") pod "network-metrics-daemon-qwv64" (UID: "e076d25a-0359-40a3-8294-d82580c2252e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:38:20.474334 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.474303 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:20.474492 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.474303 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:20.477584 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.477555 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 12 13:38:20.477729 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.477691 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sj9kz\"" Mar 12 13:38:20.477729 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.477704 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbttq\"" Mar 12 13:38:20.477846 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.477784 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 12 13:38:20.477846 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.477832 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 12 13:38:20.612666 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.612623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-blv4t" event={"ID":"ab59b1c8-0dc2-45d5-aea6-a91ec018f894","Type":"ContainerStarted","Data":"c2f65e8dfe125c93bb7c4050828ac5746418476750738647726a3aff084e84fb"} Mar 12 13:38:20.612800 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.612684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-blv4t" event={"ID":"ab59b1c8-0dc2-45d5-aea6-a91ec018f894","Type":"ContainerStarted","Data":"2e61498a9c97b262c1978f88270df9d7ea4fd8ef57f935bd478a5b9dd398f762"} Mar 12 13:38:20.628827 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.628777 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-blv4t" podStartSLOduration=1.628762578 podStartE2EDuration="1.628762578s" podCreationTimestamp="2026-03-12 13:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:38:20.628255087 +0000 UTC m=+34.717779172" watchObservedRunningTime="2026-03-12 13:38:20.628762578 +0000 UTC m=+34.718286662" Mar 12 13:38:20.962091 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:20.962010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" Mar 12 13:38:20.962251 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.962139 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 12 13:38:20.962251 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.962153 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b465c86c9-nbbq2: secret "image-registry-tls" not found Mar 12 13:38:20.962251 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:20.962223 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls podName:f2278add-3ad4-46db-9278-d8a8cab50031 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:22.96220262 +0000 UTC m=+37.051726682 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls") pod "image-registry-6b465c86c9-nbbq2" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031") : secret "image-registry-tls" not found Mar 12 13:38:21.063177 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:21.063142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n" Mar 12 13:38:21.063377 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:21.063191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd" Mar 12 13:38:21.063377 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:21.063303 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 12 13:38:21.063377 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:21.063303 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 12 13:38:21.063377 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:21.063367 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls podName:59289eca-3781-496b-9498-b1ba7c5d593e nodeName:}" failed. No retries permitted until 2026-03-12 13:38:23.063349043 +0000 UTC m=+37.152873112 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls") pod "dns-default-jjsfd" (UID: "59289eca-3781-496b-9498-b1ba7c5d593e") : secret "dns-default-metrics-tls" not found Mar 12 13:38:21.063570 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:21.063390 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert podName:3eb6f831-4019-43bf-9cec-d541e8e0f1dc nodeName:}" failed. No retries permitted until 2026-03-12 13:38:23.063381862 +0000 UTC m=+37.152905923 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert") pod "ingress-canary-4jz8n" (UID: "3eb6f831-4019-43bf-9cec-d541e8e0f1dc") : secret "canary-serving-cert" not found Mar 12 13:38:22.979956 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:22.979928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" Mar 12 13:38:22.980287 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:22.980066 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 12 13:38:22.980287 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:22.980079 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b465c86c9-nbbq2: secret "image-registry-tls" not found Mar 12 13:38:22.980287 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:22.980135 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls 
podName:f2278add-3ad4-46db-9278-d8a8cab50031 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:26.980116173 +0000 UTC m=+41.069640250 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls") pod "image-registry-6b465c86c9-nbbq2" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031") : secret "image-registry-tls" not found Mar 12 13:38:23.080761 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:23.080559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n" Mar 12 13:38:23.080825 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:23.080806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd" Mar 12 13:38:23.080860 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:23.080712 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 12 13:38:23.080905 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:23.080895 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert podName:3eb6f831-4019-43bf-9cec-d541e8e0f1dc nodeName:}" failed. No retries permitted until 2026-03-12 13:38:27.080874459 +0000 UTC m=+41.170398528 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert") pod "ingress-canary-4jz8n" (UID: "3eb6f831-4019-43bf-9cec-d541e8e0f1dc") : secret "canary-serving-cert" not found Mar 12 13:38:23.080951 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:23.080895 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 12 13:38:23.080994 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:23.080933 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls podName:59289eca-3781-496b-9498-b1ba7c5d593e nodeName:}" failed. No retries permitted until 2026-03-12 13:38:27.0809238 +0000 UTC m=+41.170447863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls") pod "dns-default-jjsfd" (UID: "59289eca-3781-496b-9498-b1ba7c5d593e") : secret "dns-default-metrics-tls" not found Mar 12 13:38:23.620155 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:23.620123 2575 generic.go:358] "Generic (PLEG): container finished" podID="c07aa00c-e596-44da-b75d-f3772a7057fd" containerID="752827e1a2505868647dd6335211330004ee6e3271f9f2b686bc188862d8d5d0" exitCode=0 Mar 12 13:38:23.620311 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:23.620168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" event={"ID":"c07aa00c-e596-44da-b75d-f3772a7057fd","Type":"ContainerDied","Data":"752827e1a2505868647dd6335211330004ee6e3271f9f2b686bc188862d8d5d0"} Mar 12 13:38:24.623976 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:24.623948 2575 generic.go:358] "Generic (PLEG): container finished" podID="c07aa00c-e596-44da-b75d-f3772a7057fd" containerID="f4ed685221df9e959d64e2d25bd0f4a823b91a5681617f497692beb18852938b" exitCode=0 Mar 12 
13:38:24.624370 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:24.623989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" event={"ID":"c07aa00c-e596-44da-b75d-f3772a7057fd","Type":"ContainerDied","Data":"f4ed685221df9e959d64e2d25bd0f4a823b91a5681617f497692beb18852938b"}
Mar 12 13:38:25.628073 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:25.628029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" event={"ID":"c07aa00c-e596-44da-b75d-f3772a7057fd","Type":"ContainerStarted","Data":"aecdb59ca65c75cb35a35f93b68173359fe3ee368dba121fd93169bfe79db2b2"}
Mar 12 13:38:25.656198 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:25.656151 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qq8v5" podStartSLOduration=6.319248508 podStartE2EDuration="39.656137155s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:37:49.595629495 +0000 UTC m=+3.685153559" lastFinishedPulling="2026-03-12 13:38:22.932518144 +0000 UTC m=+37.022042206" observedRunningTime="2026-03-12 13:38:25.654483029 +0000 UTC m=+39.744007113" watchObservedRunningTime="2026-03-12 13:38:25.656137155 +0000 UTC m=+39.745661238"
Mar 12 13:38:27.011427 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:27.011388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:38:27.011837 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:27.011498 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 12 13:38:27.011837 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:27.011509 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b465c86c9-nbbq2: secret "image-registry-tls" not found
Mar 12 13:38:27.011837 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:27.011554 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls podName:f2278add-3ad4-46db-9278-d8a8cab50031 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:35.011541713 +0000 UTC m=+49.101065775 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls") pod "image-registry-6b465c86c9-nbbq2" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031") : secret "image-registry-tls" not found
Mar 12 13:38:27.112570 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:27.112540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n"
Mar 12 13:38:27.112736 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:27.112587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd"
Mar 12 13:38:27.112736 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:27.112705 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 12 13:38:27.112736 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:27.112721 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 12 13:38:27.112850 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:27.112769 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert podName:3eb6f831-4019-43bf-9cec-d541e8e0f1dc nodeName:}" failed. No retries permitted until 2026-03-12 13:38:35.112754508 +0000 UTC m=+49.202278570 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert") pod "ingress-canary-4jz8n" (UID: "3eb6f831-4019-43bf-9cec-d541e8e0f1dc") : secret "canary-serving-cert" not found
Mar 12 13:38:27.112850 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:27.112785 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls podName:59289eca-3781-496b-9498-b1ba7c5d593e nodeName:}" failed. No retries permitted until 2026-03-12 13:38:35.112776488 +0000 UTC m=+49.202300550 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls") pod "dns-default-jjsfd" (UID: "59289eca-3781-496b-9498-b1ba7c5d593e") : secret "dns-default-metrics-tls" not found
Mar 12 13:38:32.029469 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.029437 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs"]
Mar 12 13:38:32.037522 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.037502 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs"
Mar 12 13:38:32.040847 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.040821 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs"]
Mar 12 13:38:32.041038 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.041024 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-hnxdf\""
Mar 12 13:38:32.041089 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.041039 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Mar 12 13:38:32.041342 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.041328 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Mar 12 13:38:32.151904 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.151869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg8xd\" (UniqueName: \"kubernetes.io/projected/6819031a-6b93-42f2-b7d5-28fc80fafb35-kube-api-access-pg8xd\") pod \"migrator-6b589cdcc-ftzcs\" (UID: \"6819031a-6b93-42f2-b7d5-28fc80fafb35\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs"
Mar 12 13:38:32.253138 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.253106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg8xd\" (UniqueName: \"kubernetes.io/projected/6819031a-6b93-42f2-b7d5-28fc80fafb35-kube-api-access-pg8xd\") pod \"migrator-6b589cdcc-ftzcs\" (UID: \"6819031a-6b93-42f2-b7d5-28fc80fafb35\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs"
Mar 12 13:38:32.264318 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.264289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg8xd\" (UniqueName: \"kubernetes.io/projected/6819031a-6b93-42f2-b7d5-28fc80fafb35-kube-api-access-pg8xd\") pod \"migrator-6b589cdcc-ftzcs\" (UID: \"6819031a-6b93-42f2-b7d5-28fc80fafb35\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs"
Mar 12 13:38:32.355593 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.355562 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs"
Mar 12 13:38:32.511489 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.511463 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs"]
Mar 12 13:38:32.515626 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:32.515601 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6819031a_6b93_42f2_b7d5_28fc80fafb35.slice/crio-a5d7bb9a58c640bfd1413f7cf50309ee2f2e4adce4b6c3b7eff4be9a48421b94 WatchSource:0}: Error finding container a5d7bb9a58c640bfd1413f7cf50309ee2f2e4adce4b6c3b7eff4be9a48421b94: Status 404 returned error can't find the container with id a5d7bb9a58c640bfd1413f7cf50309ee2f2e4adce4b6c3b7eff4be9a48421b94
Mar 12 13:38:32.644986 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:32.644901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs" event={"ID":"6819031a-6b93-42f2-b7d5-28fc80fafb35","Type":"ContainerStarted","Data":"a5d7bb9a58c640bfd1413f7cf50309ee2f2e4adce4b6c3b7eff4be9a48421b94"}
Mar 12 13:38:33.052371 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.052292 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-blv4t_ab59b1c8-0dc2-45d5-aea6-a91ec018f894/dns-node-resolver/0.log"
Mar 12 13:38:33.264592 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.264561 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-8vhtk"]
Mar 12 13:38:33.278630 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.278603 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.283913 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.283880 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Mar 12 13:38:33.284136 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.284120 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qvxcc\""
Mar 12 13:38:33.284244 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.284120 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Mar 12 13:38:33.285136 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.285112 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Mar 12 13:38:33.291543 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.291521 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Mar 12 13:38:33.292920 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.292898 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-8vhtk"]
Mar 12 13:38:33.462797 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.462763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec-signing-key\") pod \"service-ca-8bb587b94-8vhtk\" (UID: \"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec\") " pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.462797 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.462799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxfcn\" (UniqueName: \"kubernetes.io/projected/8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec-kube-api-access-mxfcn\") pod \"service-ca-8bb587b94-8vhtk\" (UID: \"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec\") " pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.463023 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.462859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec-signing-cabundle\") pod \"service-ca-8bb587b94-8vhtk\" (UID: \"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec\") " pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.563736 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.563694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec-signing-key\") pod \"service-ca-8bb587b94-8vhtk\" (UID: \"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec\") " pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.563736 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.563738 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxfcn\" (UniqueName: \"kubernetes.io/projected/8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec-kube-api-access-mxfcn\") pod \"service-ca-8bb587b94-8vhtk\" (UID: \"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec\") " pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.563952 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.563926 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec-signing-cabundle\") pod \"service-ca-8bb587b94-8vhtk\" (UID: \"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec\") " pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.564584 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.564567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec-signing-cabundle\") pod \"service-ca-8bb587b94-8vhtk\" (UID: \"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec\") " pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.566123 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.566105 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec-signing-key\") pod \"service-ca-8bb587b94-8vhtk\" (UID: \"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec\") " pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.572373 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.572353 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxfcn\" (UniqueName: \"kubernetes.io/projected/8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec-kube-api-access-mxfcn\") pod \"service-ca-8bb587b94-8vhtk\" (UID: \"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec\") " pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.589894 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.589872 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-8vhtk"
Mar 12 13:38:33.712182 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.712149 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-8vhtk"]
Mar 12 13:38:33.852467 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:33.852442 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g64nq_fc8195c5-3667-46e7-8bca-1b80b2d9943d/node-ca/0.log"
Mar 12 13:38:33.935787 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:33.935754 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bae6c3a_0134_4d0c_afd0_348ccdfbf8ec.slice/crio-b6190287eb2b8cf885a091bb476676635bbaa42cf2886bfaee1f9b804ce37592 WatchSource:0}: Error finding container b6190287eb2b8cf885a091bb476676635bbaa42cf2886bfaee1f9b804ce37592: Status 404 returned error can't find the container with id b6190287eb2b8cf885a091bb476676635bbaa42cf2886bfaee1f9b804ce37592
Mar 12 13:38:34.131449 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.131419 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"]
Mar 12 13:38:34.139897 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.139872 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.142447 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.142419 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"]
Mar 12 13:38:34.143232 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.143034 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Mar 12 13:38:34.143232 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.143081 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Mar 12 13:38:34.143232 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.143106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Mar 12 13:38:34.143232 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.143088 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Mar 12 13:38:34.166047 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.166017 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"]
Mar 12 13:38:34.174701 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.174643 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"
Mar 12 13:38:34.175917 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.175891 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"]
Mar 12 13:38:34.177803 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.177710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-922c2\""
Mar 12 13:38:34.177803 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.177710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Mar 12 13:38:34.187978 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.187961 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"]
Mar 12 13:38:34.199626 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.199607 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.200478 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.200461 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"]
Mar 12 13:38:34.202671 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.202614 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Mar 12 13:38:34.202753 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.202620 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Mar 12 13:38:34.202926 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.202909 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Mar 12 13:38:34.203260 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.203237 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Mar 12 13:38:34.270574 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.270554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0a4355cf-d460-4c51-9da2-1e9b85060177-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-594677db77-hl94x\" (UID: \"0a4355cf-d460-4c51-9da2-1e9b85060177\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"
Mar 12 13:38:34.270711 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.270584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbj6x\" (UniqueName: \"kubernetes.io/projected/0a4355cf-d460-4c51-9da2-1e9b85060177-kube-api-access-hbj6x\") pod \"managed-serviceaccount-addon-agent-594677db77-hl94x\" (UID: \"0a4355cf-d460-4c51-9da2-1e9b85060177\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"
Mar 12 13:38:34.270711 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.270668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b85e145-42e9-43ee-9dad-7fd0a20e145f-tmp\") pod \"klusterlet-addon-workmgr-756d7bbb78-j8cnl\" (UID: \"6b85e145-42e9-43ee-9dad-7fd0a20e145f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.270711 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.270701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6b85e145-42e9-43ee-9dad-7fd0a20e145f-klusterlet-config\") pod \"klusterlet-addon-workmgr-756d7bbb78-j8cnl\" (UID: \"6b85e145-42e9-43ee-9dad-7fd0a20e145f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.270811 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.270717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt4dq\" (UniqueName: \"kubernetes.io/projected/6b85e145-42e9-43ee-9dad-7fd0a20e145f-kube-api-access-rt4dq\") pod \"klusterlet-addon-workmgr-756d7bbb78-j8cnl\" (UID: \"6b85e145-42e9-43ee-9dad-7fd0a20e145f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.371388 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-ca\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.371486 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0a4355cf-d460-4c51-9da2-1e9b85060177-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-594677db77-hl94x\" (UID: \"0a4355cf-d460-4c51-9da2-1e9b85060177\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"
Mar 12 13:38:34.371486 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbj6x\" (UniqueName: \"kubernetes.io/projected/0a4355cf-d460-4c51-9da2-1e9b85060177-kube-api-access-hbj6x\") pod \"managed-serviceaccount-addon-agent-594677db77-hl94x\" (UID: \"0a4355cf-d460-4c51-9da2-1e9b85060177\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"
Mar 12 13:38:34.371486 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-hub\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.371619 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.371619 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b85e145-42e9-43ee-9dad-7fd0a20e145f-tmp\") pod \"klusterlet-addon-workmgr-756d7bbb78-j8cnl\" (UID: \"6b85e145-42e9-43ee-9dad-7fd0a20e145f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.371743 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.371792 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.371853 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6b85e145-42e9-43ee-9dad-7fd0a20e145f-klusterlet-config\") pod \"klusterlet-addon-workmgr-756d7bbb78-j8cnl\" (UID: \"6b85e145-42e9-43ee-9dad-7fd0a20e145f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.371853 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt4dq\" (UniqueName: \"kubernetes.io/projected/6b85e145-42e9-43ee-9dad-7fd0a20e145f-kube-api-access-rt4dq\") pod \"klusterlet-addon-workmgr-756d7bbb78-j8cnl\" (UID: \"6b85e145-42e9-43ee-9dad-7fd0a20e145f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.371853 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b85e145-42e9-43ee-9dad-7fd0a20e145f-tmp\") pod \"klusterlet-addon-workmgr-756d7bbb78-j8cnl\" (UID: \"6b85e145-42e9-43ee-9dad-7fd0a20e145f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.371969 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.371851 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knzrh\" (UniqueName: \"kubernetes.io/projected/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-kube-api-access-knzrh\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.373909 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.373890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0a4355cf-d460-4c51-9da2-1e9b85060177-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-594677db77-hl94x\" (UID: \"0a4355cf-d460-4c51-9da2-1e9b85060177\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"
Mar 12 13:38:34.374059 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.374044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6b85e145-42e9-43ee-9dad-7fd0a20e145f-klusterlet-config\") pod \"klusterlet-addon-workmgr-756d7bbb78-j8cnl\" (UID: \"6b85e145-42e9-43ee-9dad-7fd0a20e145f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.380883 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.380861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt4dq\" (UniqueName: \"kubernetes.io/projected/6b85e145-42e9-43ee-9dad-7fd0a20e145f-kube-api-access-rt4dq\") pod \"klusterlet-addon-workmgr-756d7bbb78-j8cnl\" (UID: \"6b85e145-42e9-43ee-9dad-7fd0a20e145f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.384761 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.384745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbj6x\" (UniqueName: \"kubernetes.io/projected/0a4355cf-d460-4c51-9da2-1e9b85060177-kube-api-access-hbj6x\") pod \"managed-serviceaccount-addon-agent-594677db77-hl94x\" (UID: \"0a4355cf-d460-4c51-9da2-1e9b85060177\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"
Mar 12 13:38:34.453247 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.453221 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"
Mar 12 13:38:34.472971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.472284 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-hub\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.472971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.472327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.472971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.472383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.472971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.472415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.472971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.472453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knzrh\" (UniqueName: \"kubernetes.io/projected/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-kube-api-access-knzrh\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.472971 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.472504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-ca\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.474170 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.474141 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.475403 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.475351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-ca\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.475403 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.475389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.476376 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.476349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-hub\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.476894 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.476876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.481641 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.481621 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knzrh\" (UniqueName: \"kubernetes.io/projected/60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2-kube-api-access-knzrh\") pod \"cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65\" (UID: \"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.498423 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.498400 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"
Mar 12 13:38:34.508352 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.508281 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"
Mar 12 13:38:34.603989 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.603939 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl"]
Mar 12 13:38:34.609851 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:34.609819 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b85e145_42e9_43ee_9dad_7fd0a20e145f.slice/crio-54d096945a136c70d76e99af7f5b2b0517d17c58aa846632db7c7237f7bf19d2 WatchSource:0}: Error finding container 54d096945a136c70d76e99af7f5b2b0517d17c58aa846632db7c7237f7bf19d2: Status 404 returned error can't find the container with id 54d096945a136c70d76e99af7f5b2b0517d17c58aa846632db7c7237f7bf19d2
Mar 12 13:38:34.652099 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.652008 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl" event={"ID":"6b85e145-42e9-43ee-9dad-7fd0a20e145f","Type":"ContainerStarted","Data":"54d096945a136c70d76e99af7f5b2b0517d17c58aa846632db7c7237f7bf19d2"}
Mar 12 13:38:34.654130 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.654102 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x"]
Mar 12 13:38:34.654787 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.654644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-8vhtk" event={"ID":"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec","Type":"ContainerStarted","Data":"b6190287eb2b8cf885a091bb476676635bbaa42cf2886bfaee1f9b804ce37592"}
Mar 12 13:38:34.656662 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.656597 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs" event={"ID":"6819031a-6b93-42f2-b7d5-28fc80fafb35","Type":"ContainerStarted","Data":"9bacd8d3cbe068c7bd483b6e1f2bdccd27e96932d1e486aeba5289844eeebd72"}
Mar 12 13:38:34.656662 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.656627 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs" event={"ID":"6819031a-6b93-42f2-b7d5-28fc80fafb35","Type":"ContainerStarted","Data":"d9eba64f786dc5a0d4332d43921ac6b3d6309a79970cf1352cfb3f31c9bc6c3d"}
Mar 12 13:38:34.657256 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:34.657234 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4355cf_d460_4c51_9da2_1e9b85060177.slice/crio-ec2345e41bf2f4696a7f7cb804403ec4eccacc4b352ae5f8c7664029ce8af3c2 WatchSource:0}: Error finding container ec2345e41bf2f4696a7f7cb804403ec4eccacc4b352ae5f8c7664029ce8af3c2: Status 404 returned error can't find the container with id ec2345e41bf2f4696a7f7cb804403ec4eccacc4b352ae5f8c7664029ce8af3c2
Mar 12 13:38:34.683268 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.683218 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-ftzcs" podStartSLOduration=1.00975207 podStartE2EDuration="2.68320158s" podCreationTimestamp="2026-03-12 13:38:32 +0000 UTC" firstStartedPulling="2026-03-12 13:38:32.517466657 +0000 UTC m=+46.606990722" lastFinishedPulling="2026-03-12 13:38:34.190916167 +0000 UTC m=+48.280440232" observedRunningTime="2026-03-12 13:38:34.68092028 +0000 UTC
m=+48.770444364" watchObservedRunningTime="2026-03-12 13:38:34.68320158 +0000 UTC m=+48.772725664" Mar 12 13:38:34.685061 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:34.684420 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65"] Mar 12 13:38:34.686597 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:34.686567 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60c6ec8d_d6b8_4014_a47d_bdf80ab81ff2.slice/crio-1214cb3f1a7904b77093189eaf3da63b366388bc7be604826397859136f16c10 WatchSource:0}: Error finding container 1214cb3f1a7904b77093189eaf3da63b366388bc7be604826397859136f16c10: Status 404 returned error can't find the container with id 1214cb3f1a7904b77093189eaf3da63b366388bc7be604826397859136f16c10 Mar 12 13:38:35.081597 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.081564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" Mar 12 13:38:35.081808 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.081700 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 12 13:38:35.081808 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.081729 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b465c86c9-nbbq2: secret "image-registry-tls" not found Mar 12 13:38:35.081808 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.081792 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls 
podName:f2278add-3ad4-46db-9278-d8a8cab50031 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:51.081777417 +0000 UTC m=+65.171301479 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls") pod "image-registry-6b465c86c9-nbbq2" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031") : secret "image-registry-tls" not found Mar 12 13:38:35.182609 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.182571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd" Mar 12 13:38:35.182972 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.182684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n" Mar 12 13:38:35.182972 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.182746 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 12 13:38:35.182972 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.182779 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 12 13:38:35.182972 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.182822 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls podName:59289eca-3781-496b-9498-b1ba7c5d593e nodeName:}" failed. 
No retries permitted until 2026-03-12 13:38:51.182804728 +0000 UTC m=+65.272328790 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls") pod "dns-default-jjsfd" (UID: "59289eca-3781-496b-9498-b1ba7c5d593e") : secret "dns-default-metrics-tls" not found Mar 12 13:38:35.182972 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.182837 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert podName:3eb6f831-4019-43bf-9cec-d541e8e0f1dc nodeName:}" failed. No retries permitted until 2026-03-12 13:38:51.182830979 +0000 UTC m=+65.272355042 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert") pod "ingress-canary-4jz8n" (UID: "3eb6f831-4019-43bf-9cec-d541e8e0f1dc") : secret "canary-serving-cert" not found Mar 12 13:38:35.202067 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.202032 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5nw2x"] Mar 12 13:38:35.217998 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.217976 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5nw2x"] Mar 12 13:38:35.218138 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.218114 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.221332 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.220817 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Mar 12 13:38:35.221332 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.220867 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Mar 12 13:38:35.221332 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.220884 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Mar 12 13:38:35.221332 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.220914 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4sgvv\"" Mar 12 13:38:35.221332 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.221177 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Mar 12 13:38:35.283416 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.283384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.283542 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.283501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/35d3b28f-1cd7-403d-b055-e9982477c6c5-data-volume\") pod \"insights-runtime-extractor-5nw2x\" (UID: 
\"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.283542 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.283531 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/35d3b28f-1cd7-403d-b055-e9982477c6c5-crio-socket\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.283678 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.283574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gx9p\" (UniqueName: \"kubernetes.io/projected/35d3b28f-1cd7-403d-b055-e9982477c6c5-kube-api-access-6gx9p\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.283678 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.283592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/35d3b28f-1cd7-403d-b055-e9982477c6c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.383921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gx9p\" (UniqueName: \"kubernetes.io/projected/35d3b28f-1cd7-403d-b055-e9982477c6c5-kube-api-access-6gx9p\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.383978 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/35d3b28f-1cd7-403d-b055-e9982477c6c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.384045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.384126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/35d3b28f-1cd7-403d-b055-e9982477c6c5-data-volume\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.384160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/35d3b28f-1cd7-403d-b055-e9982477c6c5-crio-socket\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.384245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/35d3b28f-1cd7-403d-b055-e9982477c6c5-crio-socket\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " 
pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.384344 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.384406 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls podName:35d3b28f-1cd7-403d-b055-e9982477c6c5 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:35.884384686 +0000 UTC m=+49.973908749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5nw2x" (UID: "35d3b28f-1cd7-403d-b055-e9982477c6c5") : secret "insights-runtime-extractor-tls" not found Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.384515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/35d3b28f-1cd7-403d-b055-e9982477c6c5-data-volume\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.385112 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.385031 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/35d3b28f-1cd7-403d-b055-e9982477c6c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.393766 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.393738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6gx9p\" (UniqueName: \"kubernetes.io/projected/35d3b28f-1cd7-403d-b055-e9982477c6c5-kube-api-access-6gx9p\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.660342 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.660253 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x" event={"ID":"0a4355cf-d460-4c51-9da2-1e9b85060177","Type":"ContainerStarted","Data":"ec2345e41bf2f4696a7f7cb804403ec4eccacc4b352ae5f8c7664029ce8af3c2"} Mar 12 13:38:35.662049 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.661992 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" event={"ID":"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2","Type":"ContainerStarted","Data":"1214cb3f1a7904b77093189eaf3da63b366388bc7be604826397859136f16c10"} Mar 12 13:38:35.888039 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:35.887998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:35.888220 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.888178 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Mar 12 13:38:35.888279 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:35.888247 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls podName:35d3b28f-1cd7-403d-b055-e9982477c6c5 nodeName:}" 
failed. No retries permitted until 2026-03-12 13:38:36.888225681 +0000 UTC m=+50.977749761 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5nw2x" (UID: "35d3b28f-1cd7-403d-b055-e9982477c6c5") : secret "insights-runtime-extractor-tls" not found Mar 12 13:38:36.894818 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:36.894782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:36.895361 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:36.895031 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Mar 12 13:38:36.895361 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:36.895090 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls podName:35d3b28f-1cd7-403d-b055-e9982477c6c5 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:38.895076208 +0000 UTC m=+52.984600270 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5nw2x" (UID: "35d3b28f-1cd7-403d-b055-e9982477c6c5") : secret "insights-runtime-extractor-tls" not found Mar 12 13:38:38.908452 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:38.908413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:38.908917 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:38.908582 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Mar 12 13:38:38.908917 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:38.908677 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls podName:35d3b28f-1cd7-403d-b055-e9982477c6c5 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:42.908639087 +0000 UTC m=+56.998163152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5nw2x" (UID: "35d3b28f-1cd7-403d-b055-e9982477c6c5") : secret "insights-runtime-extractor-tls" not found Mar 12 13:38:40.674483 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:40.674445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl" event={"ID":"6b85e145-42e9-43ee-9dad-7fd0a20e145f","Type":"ContainerStarted","Data":"1731dee39f2f2169db0665721536b0e99d61fa4df1e993e4f95ead0c3477390a"} Mar 12 13:38:40.674965 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:40.674616 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl" Mar 12 13:38:40.676069 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:40.676036 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x" event={"ID":"0a4355cf-d460-4c51-9da2-1e9b85060177","Type":"ContainerStarted","Data":"83c0b8e5f52bf2bc52449ef7bea227edb5fcb2d474a5a28d41b0dedcdff81bc9"} Mar 12 13:38:40.676734 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:40.676711 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl" Mar 12 13:38:40.677374 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:40.677351 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-8vhtk" event={"ID":"8bae6c3a-0134-4d0c-afd0-348ccdfbf8ec","Type":"ContainerStarted","Data":"b5493d46ced08dfdab51a4a4408a35f4136a529fd35ad96a0adad158b8ae2dfa"} Mar 12 13:38:40.678600 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:40.678577 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" event={"ID":"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2","Type":"ContainerStarted","Data":"133d9dc7c7727cf1e3e7a468741855df1b94f46f81719bae029b7a1721c406c3"} Mar 12 13:38:40.692048 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:40.692003 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-756d7bbb78-j8cnl" podStartSLOduration=1.556001094 podStartE2EDuration="6.6919915s" podCreationTimestamp="2026-03-12 13:38:34 +0000 UTC" firstStartedPulling="2026-03-12 13:38:34.612151538 +0000 UTC m=+48.701675602" lastFinishedPulling="2026-03-12 13:38:39.748141932 +0000 UTC m=+53.837666008" observedRunningTime="2026-03-12 13:38:40.691193946 +0000 UTC m=+54.780718036" watchObservedRunningTime="2026-03-12 13:38:40.6919915 +0000 UTC m=+54.781515584" Mar 12 13:38:40.708418 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:40.708382 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-8bb587b94-8vhtk" podStartSLOduration=2.33769329 podStartE2EDuration="7.708372s" podCreationTimestamp="2026-03-12 13:38:33 +0000 UTC" firstStartedPulling="2026-03-12 13:38:33.937781435 +0000 UTC m=+48.027305500" lastFinishedPulling="2026-03-12 13:38:39.308460144 +0000 UTC m=+53.397984210" observedRunningTime="2026-03-12 13:38:40.707704209 +0000 UTC m=+54.797228319" watchObservedRunningTime="2026-03-12 13:38:40.708372 +0000 UTC m=+54.797896120" Mar 12 13:38:40.739717 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:40.739639 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-594677db77-hl94x" podStartSLOduration=1.646938526 podStartE2EDuration="6.739624119s" podCreationTimestamp="2026-03-12 13:38:34 +0000 UTC" firstStartedPulling="2026-03-12 13:38:34.659088496 
+0000 UTC m=+48.748612558" lastFinishedPulling="2026-03-12 13:38:39.751774089 +0000 UTC m=+53.841298151" observedRunningTime="2026-03-12 13:38:40.738872888 +0000 UTC m=+54.828396971" watchObservedRunningTime="2026-03-12 13:38:40.739624119 +0000 UTC m=+54.829148207" Mar 12 13:38:42.687221 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:42.687131 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" event={"ID":"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2","Type":"ContainerStarted","Data":"9d2143ad123a3f6187cf0f2bc93a7111ca2f354376bc57e7b218f159126ee816"} Mar 12 13:38:42.687221 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:42.687176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" event={"ID":"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2","Type":"ContainerStarted","Data":"cf09dfafbc4a9471d4024eb8b2596004e2baa7d6417c723a71970a22a838253a"} Mar 12 13:38:42.706608 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:42.706560 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" podStartSLOduration=1.036906896 podStartE2EDuration="8.706546143s" podCreationTimestamp="2026-03-12 13:38:34 +0000 UTC" firstStartedPulling="2026-03-12 13:38:34.688748462 +0000 UTC m=+48.778272530" lastFinishedPulling="2026-03-12 13:38:42.358387712 +0000 UTC m=+56.447911777" observedRunningTime="2026-03-12 13:38:42.705585239 +0000 UTC m=+56.795109325" watchObservedRunningTime="2026-03-12 13:38:42.706546143 +0000 UTC m=+56.796070227" Mar 12 13:38:42.944908 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:42.944824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:42.945033 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:42.944995 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Mar 12 13:38:42.945088 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:38:42.945077 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls podName:35d3b28f-1cd7-403d-b055-e9982477c6c5 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.945056038 +0000 UTC m=+65.034580106 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5nw2x" (UID: "35d3b28f-1cd7-403d-b055-e9982477c6c5") : secret "insights-runtime-extractor-tls" not found Mar 12 13:38:44.511304 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:44.511253 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" podUID="60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:38:46.612308 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:46.612261 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-plcmr" Mar 12 13:38:51.011625 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.011586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:51.014027 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.014003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d3b28f-1cd7-403d-b055-e9982477c6c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5nw2x\" (UID: \"35d3b28f-1cd7-403d-b055-e9982477c6c5\") " pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:51.112792 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.112757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" Mar 12 13:38:51.115326 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.115299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"image-registry-6b465c86c9-nbbq2\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") " pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" Mar 12 13:38:51.134246 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.134220 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4sgvv\"" Mar 12 13:38:51.139637 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.139621 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5nw2x" Mar 12 13:38:51.214122 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.214087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n" Mar 12 13:38:51.214269 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.214172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd" Mar 12 13:38:51.216562 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.216514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3eb6f831-4019-43bf-9cec-d541e8e0f1dc-cert\") pod \"ingress-canary-4jz8n\" (UID: \"3eb6f831-4019-43bf-9cec-d541e8e0f1dc\") " pod="openshift-ingress-canary/ingress-canary-4jz8n" Mar 12 13:38:51.216694 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.216635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59289eca-3781-496b-9498-b1ba7c5d593e-metrics-tls\") pod \"dns-default-jjsfd\" (UID: \"59289eca-3781-496b-9498-b1ba7c5d593e\") " pod="openshift-dns/dns-default-jjsfd" Mar 12 13:38:51.264924 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.264845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5nw2x"] Mar 12 13:38:51.267406 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:51.267380 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d3b28f_1cd7_403d_b055_e9982477c6c5.slice/crio-35e404b4127db25618ec9c41f2351b859f1da72ebf83ce3196284f17482363c4 WatchSource:0}: Error finding container 35e404b4127db25618ec9c41f2351b859f1da72ebf83ce3196284f17482363c4: Status 404 returned error can't find the container with id 35e404b4127db25618ec9c41f2351b859f1da72ebf83ce3196284f17482363c4 Mar 12 13:38:51.388517 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.388484 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9d7nr\"" Mar 12 13:38:51.396784 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.396756 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" Mar 12 13:38:51.404207 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.404185 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4fw8j\"" Mar 12 13:38:51.409212 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.409192 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6llcj\"" Mar 12 13:38:51.411516 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.411483 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jjsfd" Mar 12 13:38:51.417305 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.417281 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4jz8n" Mar 12 13:38:51.543005 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.542973 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b465c86c9-nbbq2"] Mar 12 13:38:51.546488 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:51.546458 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2278add_3ad4_46db_9278_d8a8cab50031.slice/crio-4db2fe244d1c1da2a16b987e3a9dcb7bc40ddc9fa467b668d6d50687a81e0807 WatchSource:0}: Error finding container 4db2fe244d1c1da2a16b987e3a9dcb7bc40ddc9fa467b668d6d50687a81e0807: Status 404 returned error can't find the container with id 4db2fe244d1c1da2a16b987e3a9dcb7bc40ddc9fa467b668d6d50687a81e0807 Mar 12 13:38:51.556529 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.556509 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4jz8n"] Mar 12 13:38:51.560344 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:51.560320 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eb6f831_4019_43bf_9cec_d541e8e0f1dc.slice/crio-7a2ff975bd33aaa0947798618cef64562a42149b97e39c2aabead032061266b7 WatchSource:0}: Error finding container 7a2ff975bd33aaa0947798618cef64562a42149b97e39c2aabead032061266b7: Status 404 returned error can't find the container with id 7a2ff975bd33aaa0947798618cef64562a42149b97e39c2aabead032061266b7 Mar 12 13:38:51.576034 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.574455 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jjsfd"] Mar 12 13:38:51.581585 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:51.581549 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59289eca_3781_496b_9498_b1ba7c5d593e.slice/crio-99b5544f0c7fc60577a9242972e526239446f4d881cc7ed94b1265fdefc45639 WatchSource:0}: Error finding container 99b5544f0c7fc60577a9242972e526239446f4d881cc7ed94b1265fdefc45639: Status 404 returned error can't find the container with id 99b5544f0c7fc60577a9242972e526239446f4d881cc7ed94b1265fdefc45639 Mar 12 13:38:51.709879 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.709827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5nw2x" event={"ID":"35d3b28f-1cd7-403d-b055-e9982477c6c5","Type":"ContainerStarted","Data":"abb58b13b7b71a817ff84c76ca23538f0760e11d94de5058995c883c2e58358d"} Mar 12 13:38:51.709879 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.709878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5nw2x" event={"ID":"35d3b28f-1cd7-403d-b055-e9982477c6c5","Type":"ContainerStarted","Data":"35e404b4127db25618ec9c41f2351b859f1da72ebf83ce3196284f17482363c4"} Mar 12 13:38:51.710820 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.710789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4jz8n" event={"ID":"3eb6f831-4019-43bf-9cec-d541e8e0f1dc","Type":"ContainerStarted","Data":"7a2ff975bd33aaa0947798618cef64562a42149b97e39c2aabead032061266b7"} Mar 12 13:38:51.712054 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.712028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" event={"ID":"f2278add-3ad4-46db-9278-d8a8cab50031","Type":"ContainerStarted","Data":"466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a"} Mar 12 13:38:51.712169 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.712062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" 
event={"ID":"f2278add-3ad4-46db-9278-d8a8cab50031","Type":"ContainerStarted","Data":"4db2fe244d1c1da2a16b987e3a9dcb7bc40ddc9fa467b668d6d50687a81e0807"} Mar 12 13:38:51.712169 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.712107 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" Mar 12 13:38:51.713011 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.712994 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jjsfd" event={"ID":"59289eca-3781-496b-9498-b1ba7c5d593e","Type":"ContainerStarted","Data":"99b5544f0c7fc60577a9242972e526239446f4d881cc7ed94b1265fdefc45639"} Mar 12 13:38:51.732745 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:51.732707 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" podStartSLOduration=41.732695027 podStartE2EDuration="41.732695027s" podCreationTimestamp="2026-03-12 13:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:38:51.732025624 +0000 UTC m=+65.821549709" watchObservedRunningTime="2026-03-12 13:38:51.732695027 +0000 UTC m=+65.822219108" Mar 12 13:38:52.321295 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.321251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:52.321734 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.321307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:52.324912 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.324411 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 12 13:38:52.324912 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.324718 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 12 13:38:52.334672 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.334464 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 12 13:38:52.344021 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.343997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e076d25a-0359-40a3-8294-d82580c2252e-metrics-certs\") pod \"network-metrics-daemon-qwv64\" (UID: \"e076d25a-0359-40a3-8294-d82580c2252e\") " pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:52.348383 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.348358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xbfm\" (UniqueName: \"kubernetes.io/projected/018363d6-b28d-4856-9451-fcf1632349aa-kube-api-access-4xbfm\") pod \"network-check-target-mms2n\" (UID: \"018363d6-b28d-4856-9451-fcf1632349aa\") " pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:52.588586 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.588564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sj9kz\"" Mar 12 13:38:52.595347 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.595160 
2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbttq\"" Mar 12 13:38:52.596737 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.596713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:52.601929 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.601907 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwv64" Mar 12 13:38:52.718411 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.718385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5nw2x" event={"ID":"35d3b28f-1cd7-403d-b055-e9982477c6c5","Type":"ContainerStarted","Data":"b4c12e2ba055de488804573222c665c04fb0c459e2f1751c4176ca9981554f7f"} Mar 12 13:38:52.759680 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.758690 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qwv64"] Mar 12 13:38:52.763601 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:52.763574 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode076d25a_0359_40a3_8294_d82580c2252e.slice/crio-fee591c9de561a7c918c3704400b7cb2569a0c9bb59c2d3e3728ee05442c0f7f WatchSource:0}: Error finding container fee591c9de561a7c918c3704400b7cb2569a0c9bb59c2d3e3728ee05442c0f7f: Status 404 returned error can't find the container with id fee591c9de561a7c918c3704400b7cb2569a0c9bb59c2d3e3728ee05442c0f7f Mar 12 13:38:52.778680 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:52.777797 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mms2n"] Mar 12 13:38:52.781715 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:52.781690 2575 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018363d6_b28d_4856_9451_fcf1632349aa.slice/crio-3a3d878edec26bd4fe00af30f21b879062c22eace5a7b657b12a10289478e545 WatchSource:0}: Error finding container 3a3d878edec26bd4fe00af30f21b879062c22eace5a7b657b12a10289478e545: Status 404 returned error can't find the container with id 3a3d878edec26bd4fe00af30f21b879062c22eace5a7b657b12a10289478e545 Mar 12 13:38:53.722848 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:53.722805 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mms2n" event={"ID":"018363d6-b28d-4856-9451-fcf1632349aa","Type":"ContainerStarted","Data":"3a3d878edec26bd4fe00af30f21b879062c22eace5a7b657b12a10289478e545"} Mar 12 13:38:53.724125 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:53.724083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qwv64" event={"ID":"e076d25a-0359-40a3-8294-d82580c2252e","Type":"ContainerStarted","Data":"fee591c9de561a7c918c3704400b7cb2569a0c9bb59c2d3e3728ee05442c0f7f"} Mar 12 13:38:54.509935 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:54.509890 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" podUID="60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:38:54.729252 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:54.729199 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4jz8n" event={"ID":"3eb6f831-4019-43bf-9cec-d541e8e0f1dc","Type":"ContainerStarted","Data":"5896a6e7b445a410a1529cf55b236d8925e6e1ee9db94f6f8c8d28b777da0e86"} Mar 12 13:38:54.731081 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:54.731054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jjsfd" 
event={"ID":"59289eca-3781-496b-9498-b1ba7c5d593e","Type":"ContainerStarted","Data":"5b0a5bb8b31cd69aeaf43dfcb2a0f56579a385f30cb7ebd764a2c01b05a01684"} Mar 12 13:38:54.731204 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:54.731090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jjsfd" event={"ID":"59289eca-3781-496b-9498-b1ba7c5d593e","Type":"ContainerStarted","Data":"475d6811311a804610a5c92b4c13aadaeb7144705a3abc75c771565798ccdcbc"} Mar 12 13:38:54.731204 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:54.731188 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jjsfd" Mar 12 13:38:54.745911 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:54.745867 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4jz8n" podStartSLOduration=33.507578166 podStartE2EDuration="35.745850667s" podCreationTimestamp="2026-03-12 13:38:19 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.562080875 +0000 UTC m=+65.651604941" lastFinishedPulling="2026-03-12 13:38:53.800353357 +0000 UTC m=+67.889877442" observedRunningTime="2026-03-12 13:38:54.745169555 +0000 UTC m=+68.834693641" watchObservedRunningTime="2026-03-12 13:38:54.745850667 +0000 UTC m=+68.835374731" Mar 12 13:38:54.762702 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:54.762636 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jjsfd" podStartSLOduration=33.544166626 podStartE2EDuration="35.762620132s" podCreationTimestamp="2026-03-12 13:38:19 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.583235455 +0000 UTC m=+65.672759518" lastFinishedPulling="2026-03-12 13:38:53.801688958 +0000 UTC m=+67.891213024" observedRunningTime="2026-03-12 13:38:54.762196582 +0000 UTC m=+68.851720667" watchObservedRunningTime="2026-03-12 13:38:54.762620132 +0000 UTC m=+68.852144219" Mar 12 13:38:55.739909 ip-10-0-139-20 
kubenswrapper[2575]: I0312 13:38:55.739865 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5nw2x" event={"ID":"35d3b28f-1cd7-403d-b055-e9982477c6c5","Type":"ContainerStarted","Data":"fb6a5ee0f3eaa115508a85af65b8e3a85fbb734f43ee3da2534daba7f52089a6"} Mar 12 13:38:55.741750 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:55.741716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qwv64" event={"ID":"e076d25a-0359-40a3-8294-d82580c2252e","Type":"ContainerStarted","Data":"ede3aa7bf637e346ec61f9605b2ab3da436a27e35be43862e28aef5427711fe7"} Mar 12 13:38:55.741888 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:55.741759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qwv64" event={"ID":"e076d25a-0359-40a3-8294-d82580c2252e","Type":"ContainerStarted","Data":"f32eaa2a13cd23853d0d98d4fb529d22f4c388748637bed0c6a8e9f8bb1edf54"} Mar 12 13:38:55.762122 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:55.762065 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5nw2x" podStartSLOduration=17.244627668 podStartE2EDuration="20.762048708s" podCreationTimestamp="2026-03-12 13:38:35 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.336152614 +0000 UTC m=+65.425676675" lastFinishedPulling="2026-03-12 13:38:54.853573641 +0000 UTC m=+68.943097715" observedRunningTime="2026-03-12 13:38:55.761038965 +0000 UTC m=+69.850563051" watchObservedRunningTime="2026-03-12 13:38:55.762048708 +0000 UTC m=+69.851572793" Mar 12 13:38:55.784140 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:55.784083 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qwv64" podStartSLOduration=67.744027943 podStartE2EDuration="1m9.784066473s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 
13:38:52.765255582 +0000 UTC m=+66.854779646" lastFinishedPulling="2026-03-12 13:38:54.805294102 +0000 UTC m=+68.894818176" observedRunningTime="2026-03-12 13:38:55.783258317 +0000 UTC m=+69.872782400" watchObservedRunningTime="2026-03-12 13:38:55.784066473 +0000 UTC m=+69.873590560" Mar 12 13:38:55.925871 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:55.925837 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b465c86c9-nbbq2"] Mar 12 13:38:55.976197 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:55.976143 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57959578bc-tzqds"] Mar 12 13:38:55.996888 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:55.996801 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57959578bc-tzqds"] Mar 12 13:38:55.997036 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:55.996940 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.057359 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.057315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-bound-sa-token\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.057541 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.057368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-registry-certificates\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.057541 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.057403 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-trusted-ca\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.057541 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.057435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-image-registry-private-configuration\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.057541 ip-10-0-139-20 kubenswrapper[2575]: I0312 
13:38:56.057463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-installation-pull-secrets\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.057541 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.057489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-registry-tls\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.057541 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.057522 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv26g\" (UniqueName: \"kubernetes.io/projected/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-kube-api-access-vv26g\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.057917 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.057556 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-ca-trust-extracted\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.158256 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.158213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-image-registry-private-configuration\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.158442 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.158267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-installation-pull-secrets\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.158442 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.158295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-registry-tls\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.158442 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.158331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv26g\" (UniqueName: \"kubernetes.io/projected/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-kube-api-access-vv26g\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.158442 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.158356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-ca-trust-extracted\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " 
pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.158442 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.158423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-bound-sa-token\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.158721 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.158460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-registry-certificates\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.158721 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.158520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-trusted-ca\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.159191 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.159142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-ca-trust-extracted\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.159956 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.159935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-trusted-ca\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.160145 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.160117 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-registry-certificates\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.161221 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.161193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-image-registry-private-configuration\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.161323 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.161225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-installation-pull-secrets\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.161421 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.161402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-registry-tls\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 
13:38:56.176912 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.176882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-bound-sa-token\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.177029 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.177018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv26g\" (UniqueName: \"kubernetes.io/projected/e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9-kube-api-access-vv26g\") pod \"image-registry-57959578bc-tzqds\" (UID: \"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9\") " pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.308412 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.308333 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:56.758835 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:56.758808 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57959578bc-tzqds"] Mar 12 13:38:56.762056 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:38:56.762031 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode87a1b1a_115b_4b7e_8c9c_69b2fd21d9d9.slice/crio-6521601ba297fb64aeba9237070ae0e440d39806683d6ae19c29a717fcd01e35 WatchSource:0}: Error finding container 6521601ba297fb64aeba9237070ae0e440d39806683d6ae19c29a717fcd01e35: Status 404 returned error can't find the container with id 6521601ba297fb64aeba9237070ae0e440d39806683d6ae19c29a717fcd01e35 Mar 12 13:38:57.748724 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:57.748677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-mms2n" event={"ID":"018363d6-b28d-4856-9451-fcf1632349aa","Type":"ContainerStarted","Data":"7e74550e97518b010b237506d783945ed9b3e5b4499576db4aa6c5bd9075bef1"} Mar 12 13:38:57.749000 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:57.748951 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mms2n" Mar 12 13:38:57.749923 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:57.749902 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57959578bc-tzqds" event={"ID":"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9","Type":"ContainerStarted","Data":"392966f0fa44affbc3be64c5a2539a0c2c1cf29d524b2a29b93e295886f170b0"} Mar 12 13:38:57.749923 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:57.749924 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57959578bc-tzqds" event={"ID":"e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9","Type":"ContainerStarted","Data":"6521601ba297fb64aeba9237070ae0e440d39806683d6ae19c29a717fcd01e35"} Mar 12 13:38:57.750063 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:57.750024 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57959578bc-tzqds" Mar 12 13:38:57.773103 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:57.773065 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mms2n" podStartSLOduration=67.628491571 podStartE2EDuration="1m11.773055141s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:38:52.7848034 +0000 UTC m=+66.874327470" lastFinishedPulling="2026-03-12 13:38:56.929366975 +0000 UTC m=+71.018891040" observedRunningTime="2026-03-12 13:38:57.771427474 +0000 UTC m=+71.860951581" watchObservedRunningTime="2026-03-12 13:38:57.773055141 +0000 UTC 
m=+71.862579225" Mar 12 13:38:57.798095 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:57.798056 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57959578bc-tzqds" podStartSLOduration=2.798046046 podStartE2EDuration="2.798046046s" podCreationTimestamp="2026-03-12 13:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:38:57.797540539 +0000 UTC m=+71.887064623" watchObservedRunningTime="2026-03-12 13:38:57.798046046 +0000 UTC m=+71.887570130" Mar 12 13:38:59.991746 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:59.991713 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8b6fx"] Mar 12 13:38:59.997077 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:38:59.997059 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.005640 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.005621 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Mar 12 13:39:00.006615 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.006597 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8b6fx"] Mar 12 13:39:00.087085 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.087045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a36989ce-1faa-4a64-9750-ffc5facf702b-kubelet-config\") pod \"global-pull-secret-syncer-8b6fx\" (UID: \"a36989ce-1faa-4a64-9750-ffc5facf702b\") " pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.087085 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.087081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a36989ce-1faa-4a64-9750-ffc5facf702b-original-pull-secret\") pod \"global-pull-secret-syncer-8b6fx\" (UID: \"a36989ce-1faa-4a64-9750-ffc5facf702b\") " pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.087265 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.087098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a36989ce-1faa-4a64-9750-ffc5facf702b-dbus\") pod \"global-pull-secret-syncer-8b6fx\" (UID: \"a36989ce-1faa-4a64-9750-ffc5facf702b\") " pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.188090 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.188038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a36989ce-1faa-4a64-9750-ffc5facf702b-kubelet-config\") pod \"global-pull-secret-syncer-8b6fx\" (UID: \"a36989ce-1faa-4a64-9750-ffc5facf702b\") " pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.188090 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.188088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a36989ce-1faa-4a64-9750-ffc5facf702b-original-pull-secret\") pod \"global-pull-secret-syncer-8b6fx\" (UID: \"a36989ce-1faa-4a64-9750-ffc5facf702b\") " pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.188090 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.188107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a36989ce-1faa-4a64-9750-ffc5facf702b-dbus\") pod \"global-pull-secret-syncer-8b6fx\" (UID: \"a36989ce-1faa-4a64-9750-ffc5facf702b\") " pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.188347 ip-10-0-139-20 kubenswrapper[2575]: I0312 
13:39:00.188175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a36989ce-1faa-4a64-9750-ffc5facf702b-kubelet-config\") pod \"global-pull-secret-syncer-8b6fx\" (UID: \"a36989ce-1faa-4a64-9750-ffc5facf702b\") " pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.188347 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.188236 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a36989ce-1faa-4a64-9750-ffc5facf702b-dbus\") pod \"global-pull-secret-syncer-8b6fx\" (UID: \"a36989ce-1faa-4a64-9750-ffc5facf702b\") " pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.190500 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.190479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a36989ce-1faa-4a64-9750-ffc5facf702b-original-pull-secret\") pod \"global-pull-secret-syncer-8b6fx\" (UID: \"a36989ce-1faa-4a64-9750-ffc5facf702b\") " pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.305821 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.305755 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8b6fx" Mar 12 13:39:00.434868 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.434845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8b6fx"] Mar 12 13:39:00.436760 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:39:00.436732 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda36989ce_1faa_4a64_9750_ffc5facf702b.slice/crio-86944dc728b1490898af4b73bfd3f969e72d2eb3bf466e77411fb243ffcfba2a WatchSource:0}: Error finding container 86944dc728b1490898af4b73bfd3f969e72d2eb3bf466e77411fb243ffcfba2a: Status 404 returned error can't find the container with id 86944dc728b1490898af4b73bfd3f969e72d2eb3bf466e77411fb243ffcfba2a Mar 12 13:39:00.759461 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:00.759424 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8b6fx" event={"ID":"a36989ce-1faa-4a64-9750-ffc5facf702b","Type":"ContainerStarted","Data":"86944dc728b1490898af4b73bfd3f969e72d2eb3bf466e77411fb243ffcfba2a"} Mar 12 13:39:04.509960 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.509904 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" podUID="60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:39:04.510310 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.509989 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" Mar 12 13:39:04.510561 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.510529 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" 
containerStatusID={"Type":"cri-o","ID":"9d2143ad123a3f6187cf0f2bc93a7111ca2f354376bc57e7b218f159126ee816"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" containerMessage="Container service-proxy failed liveness probe, will be restarted" Mar 12 13:39:04.510613 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.510587 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" podUID="60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2" containerName="service-proxy" containerID="cri-o://9d2143ad123a3f6187cf0f2bc93a7111ca2f354376bc57e7b218f159126ee816" gracePeriod=30 Mar 12 13:39:04.743752 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.743723 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jjsfd" Mar 12 13:39:04.772313 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.772224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8b6fx" event={"ID":"a36989ce-1faa-4a64-9750-ffc5facf702b","Type":"ContainerStarted","Data":"15240e37c102f09f26418aac959d52ceafaf6b71e579a4187abed9bd3654c43c"} Mar 12 13:39:04.774762 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.774735 2575 generic.go:358] "Generic (PLEG): container finished" podID="60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2" containerID="9d2143ad123a3f6187cf0f2bc93a7111ca2f354376bc57e7b218f159126ee816" exitCode=2 Mar 12 13:39:04.774875 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.774827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" event={"ID":"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2","Type":"ContainerDied","Data":"9d2143ad123a3f6187cf0f2bc93a7111ca2f354376bc57e7b218f159126ee816"} Mar 12 13:39:04.774875 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.774852 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6fb9ff6d8b-hcp65" event={"ID":"60c6ec8d-d6b8-4014-a47d-bdf80ab81ff2","Type":"ContainerStarted","Data":"c62a59185d443452748fdc3a9dfa9825cd5b11c862ad9aacbbe702b2cb58931e"} Mar 12 13:39:04.810830 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:04.810788 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8b6fx" podStartSLOduration=2.22187521 podStartE2EDuration="5.810775612s" podCreationTimestamp="2026-03-12 13:38:59 +0000 UTC" firstStartedPulling="2026-03-12 13:39:00.438534524 +0000 UTC m=+74.528058587" lastFinishedPulling="2026-03-12 13:39:04.027434911 +0000 UTC m=+78.116958989" observedRunningTime="2026-03-12 13:39:04.788711927 +0000 UTC m=+78.878236011" watchObservedRunningTime="2026-03-12 13:39:04.810775612 +0000 UTC m=+78.900299696" Mar 12 13:39:08.821728 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.821695 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vpw9p"] Mar 12 13:39:08.826628 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.826609 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:08.842121 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.842098 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Mar 12 13:39:08.842237 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.842226 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Mar 12 13:39:08.844180 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.844158 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Mar 12 13:39:08.844290 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.844249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-cf95r\"" Mar 12 13:39:08.848548 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.848530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Mar 12 13:39:08.849605 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.849590 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Mar 12 13:39:08.851176 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.851160 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Mar 12 13:39:08.954379 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.954338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-root\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 
12 13:39:08.954379 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.954380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-metrics-client-ca\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:08.954608 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.954414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqfdg\" (UniqueName: \"kubernetes.io/projected/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-kube-api-access-jqfdg\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:08.954608 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.954463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-tls\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:08.954608 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.954491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-accelerators-collector-config\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:08.954608 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.954521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-sys\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:08.954608 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.954546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-textfile\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:08.954608 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.954570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:08.954865 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:08.954619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-wtmp\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055413 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-root\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055413 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055416 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-metrics-client-ca\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055627 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqfdg\" (UniqueName: \"kubernetes.io/projected/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-kube-api-access-jqfdg\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055627 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-root\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055627 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-tls\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055812 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-accelerators-collector-config\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055812 
ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-sys\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055812 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-textfile\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055812 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055812 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-wtmp\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.055812 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-sys\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 
13:39:09.056095 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.055943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-wtmp\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.056187 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.056169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-metrics-client-ca\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.056249 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.056184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-textfile\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.056415 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.056376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-accelerators-collector-config\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.058617 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.058590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-tls\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " 
pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.058746 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.058678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.063925 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.063902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqfdg\" (UniqueName: \"kubernetes.io/projected/d7e4b839-6cf6-4c5c-a29c-34cc8c85e230-kube-api-access-jqfdg\") pod \"node-exporter-vpw9p\" (UID: \"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230\") " pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.138538 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.138509 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vpw9p" Mar 12 13:39:09.148593 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:39:09.148559 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e4b839_6cf6_4c5c_a29c_34cc8c85e230.slice/crio-ec4a4d337caf5ea14c4dbaab5f7faff3daf28d7677f9ce519362b787e6859c13 WatchSource:0}: Error finding container ec4a4d337caf5ea14c4dbaab5f7faff3daf28d7677f9ce519362b787e6859c13: Status 404 returned error can't find the container with id ec4a4d337caf5ea14c4dbaab5f7faff3daf28d7677f9ce519362b787e6859c13 Mar 12 13:39:09.793795 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:09.793759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vpw9p" event={"ID":"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230","Type":"ContainerStarted","Data":"ec4a4d337caf5ea14c4dbaab5f7faff3daf28d7677f9ce519362b787e6859c13"} Mar 12 13:39:10.797408 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:10.797377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vpw9p" event={"ID":"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230","Type":"ContainerStarted","Data":"dc6f6ed6d5595981d09ea23f11b27b89dca291e3ab861cf6d5049359d940cd4c"} Mar 12 13:39:11.801114 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:11.801077 2575 generic.go:358] "Generic (PLEG): container finished" podID="d7e4b839-6cf6-4c5c-a29c-34cc8c85e230" containerID="dc6f6ed6d5595981d09ea23f11b27b89dca291e3ab861cf6d5049359d940cd4c" exitCode=0 Mar 12 13:39:11.801546 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:11.801122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vpw9p" event={"ID":"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230","Type":"ContainerDied","Data":"dc6f6ed6d5595981d09ea23f11b27b89dca291e3ab861cf6d5049359d940cd4c"} Mar 12 13:39:12.804991 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:12.804952 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vpw9p" event={"ID":"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230","Type":"ContainerStarted","Data":"0f285f6d201cad8e1a5b1778742da59d0116dc841f749cdf27eb1ec637e60bc7"} Mar 12 13:39:12.804991 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:12.804989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vpw9p" event={"ID":"d7e4b839-6cf6-4c5c-a29c-34cc8c85e230","Type":"ContainerStarted","Data":"18d2fb5e1577f09ff03876dde1b00898f6cf70d981a8ee5295af78857f660e40"} Mar 12 13:39:12.826475 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:12.826431 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vpw9p" podStartSLOduration=3.366417842 podStartE2EDuration="4.826415833s" podCreationTimestamp="2026-03-12 13:39:08 +0000 UTC" firstStartedPulling="2026-03-12 13:39:09.150554487 +0000 UTC m=+83.240078548" lastFinishedPulling="2026-03-12 13:39:10.610552476 +0000 UTC m=+84.700076539" observedRunningTime="2026-03-12 13:39:12.825187818 +0000 UTC m=+86.914711914" watchObservedRunningTime="2026-03-12 13:39:12.826415833 +0000 UTC m=+86.915939920" Mar 12 13:39:15.932168 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.932139 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" Mar 12 13:39:15.982404 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.982367 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57cb56fb88-fjnwg"] Mar 12 13:39:15.986982 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.986960 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57cb56fb88-fjnwg" Mar 12 13:39:15.989990 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.989972 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 12 13:39:15.989990 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.989987 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 12 13:39:15.990522 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.990507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 12 13:39:15.990739 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.990721 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 12 13:39:15.990833 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.990746 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-mksw2\"" Mar 12 13:39:15.990833 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.990791 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 12 13:39:15.991275 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.991262 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 12 13:39:15.992162 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.992145 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 12 13:39:15.997903 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:15.997874 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 12 13:39:15.999382 ip-10-0-139-20 kubenswrapper[2575]: 
I0312 13:39:15.999363 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57cb56fb88-fjnwg"]
Mar 12 13:39:16.111106 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.111074 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2kc9\" (UniqueName: \"kubernetes.io/projected/8bd558aa-d655-4382-980d-2a16b2adaafd-kube-api-access-q2kc9\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.111259 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.111124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-service-ca\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.111259 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.111167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-trusted-ca-bundle\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.111259 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.111204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-serving-cert\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.111259 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.111225 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-oauth-serving-cert\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.111428 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.111265 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-oauth-config\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.111428 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.111293 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-console-config\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.211844 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.211766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-trusted-ca-bundle\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.211844 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.211810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-serving-cert\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.211844 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.211830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-oauth-serving-cert\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.211844 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.211850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-oauth-config\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.212125 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.211867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-console-config\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.212125 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.211901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2kc9\" (UniqueName: \"kubernetes.io/projected/8bd558aa-d655-4382-980d-2a16b2adaafd-kube-api-access-q2kc9\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.212125 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.211929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-service-ca\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.212769 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.212745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-console-config\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.212868 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.212806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-service-ca\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.212868 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.212828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-trusted-ca-bundle\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.212945 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.212876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-oauth-serving-cert\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.214440 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.214414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-serving-cert\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.214528 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.214417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-oauth-config\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.222939 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.222916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2kc9\" (UniqueName: \"kubernetes.io/projected/8bd558aa-d655-4382-980d-2a16b2adaafd-kube-api-access-q2kc9\") pod \"console-57cb56fb88-fjnwg\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.296199 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.296179 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:16.418418 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.418390 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57cb56fb88-fjnwg"]
Mar 12 13:39:16.421033 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:39:16.421011 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bd558aa_d655_4382_980d_2a16b2adaafd.slice/crio-4cde91e245159a7ba9ff87ec6539a1b4ae66541dbc0739f34389480a61484a0d WatchSource:0}: Error finding container 4cde91e245159a7ba9ff87ec6539a1b4ae66541dbc0739f34389480a61484a0d: Status 404 returned error can't find the container with id 4cde91e245159a7ba9ff87ec6539a1b4ae66541dbc0739f34389480a61484a0d
Mar 12 13:39:16.819945 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:16.819908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cb56fb88-fjnwg" event={"ID":"8bd558aa-d655-4382-980d-2a16b2adaafd","Type":"ContainerStarted","Data":"4cde91e245159a7ba9ff87ec6539a1b4ae66541dbc0739f34389480a61484a0d"}
Mar 12 13:39:18.757636 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:18.757605 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57959578bc-tzqds"
Mar 12 13:39:19.829581 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:19.829540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cb56fb88-fjnwg" event={"ID":"8bd558aa-d655-4382-980d-2a16b2adaafd","Type":"ContainerStarted","Data":"3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21"}
Mar 12 13:39:19.849306 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:19.849227 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57cb56fb88-fjnwg" podStartSLOduration=1.942972969 podStartE2EDuration="4.849212131s" podCreationTimestamp="2026-03-12 13:39:15 +0000 UTC" firstStartedPulling="2026-03-12 13:39:16.422992757 +0000 UTC m=+90.512516822" lastFinishedPulling="2026-03-12 13:39:19.329231919 +0000 UTC m=+93.418755984" observedRunningTime="2026-03-12 13:39:19.848199484 +0000 UTC m=+93.937723568" watchObservedRunningTime="2026-03-12 13:39:19.849212131 +0000 UTC m=+93.938736216"
Mar 12 13:39:20.947415 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:20.947375 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" podUID="f2278add-3ad4-46db-9278-d8a8cab50031" containerName="registry" containerID="cri-o://466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a" gracePeriod=30
Mar 12 13:39:21.179432 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.179411 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:39:21.251505 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251435 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdgkr\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-kube-api-access-gdgkr\") pod \"f2278add-3ad4-46db-9278-d8a8cab50031\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") "
Mar 12 13:39:21.251505 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251494 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-registry-certificates\") pod \"f2278add-3ad4-46db-9278-d8a8cab50031\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") "
Mar 12 13:39:21.251723 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251535 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-trusted-ca\") pod \"f2278add-3ad4-46db-9278-d8a8cab50031\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") "
Mar 12 13:39:21.251723 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251564 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") pod \"f2278add-3ad4-46db-9278-d8a8cab50031\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") "
Mar 12 13:39:21.251723 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251592 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-image-registry-private-configuration\") pod \"f2278add-3ad4-46db-9278-d8a8cab50031\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") "
Mar 12 13:39:21.251723 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251630 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-bound-sa-token\") pod \"f2278add-3ad4-46db-9278-d8a8cab50031\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") "
Mar 12 13:39:21.251723 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251689 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2278add-3ad4-46db-9278-d8a8cab50031-ca-trust-extracted\") pod \"f2278add-3ad4-46db-9278-d8a8cab50031\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") "
Mar 12 13:39:21.251723 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251719 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-installation-pull-secrets\") pod \"f2278add-3ad4-46db-9278-d8a8cab50031\" (UID: \"f2278add-3ad4-46db-9278-d8a8cab50031\") "
Mar 12 13:39:21.252013 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251983 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f2278add-3ad4-46db-9278-d8a8cab50031" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:39:21.252013 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.251997 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f2278add-3ad4-46db-9278-d8a8cab50031" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:39:21.254321 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.254242 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-kube-api-access-gdgkr" (OuterVolumeSpecName: "kube-api-access-gdgkr") pod "f2278add-3ad4-46db-9278-d8a8cab50031" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031"). InnerVolumeSpecName "kube-api-access-gdgkr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:39:21.254321 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.254284 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f2278add-3ad4-46db-9278-d8a8cab50031" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:39:21.254632 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.254428 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f2278add-3ad4-46db-9278-d8a8cab50031" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:39:21.254632 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.254467 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f2278add-3ad4-46db-9278-d8a8cab50031" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:39:21.254632 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.254579 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f2278add-3ad4-46db-9278-d8a8cab50031" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:39:21.263201 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.263174 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2278add-3ad4-46db-9278-d8a8cab50031-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f2278add-3ad4-46db-9278-d8a8cab50031" (UID: "f2278add-3ad4-46db-9278-d8a8cab50031"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 13:39:21.352738 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.352709 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-installation-pull-secrets\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:39:21.352738 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.352734 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gdgkr\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-kube-api-access-gdgkr\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:39:21.352738 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.352743 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-registry-certificates\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:39:21.352911 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.352752 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2278add-3ad4-46db-9278-d8a8cab50031-trusted-ca\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:39:21.352911 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.352761 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-registry-tls\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:39:21.352911 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.352769 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f2278add-3ad4-46db-9278-d8a8cab50031-image-registry-private-configuration\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:39:21.352911 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.352778 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2278add-3ad4-46db-9278-d8a8cab50031-bound-sa-token\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:39:21.352911 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.352788 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2278add-3ad4-46db-9278-d8a8cab50031-ca-trust-extracted\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:39:21.834959 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.834922 2575 generic.go:358] "Generic (PLEG): container finished" podID="f2278add-3ad4-46db-9278-d8a8cab50031" containerID="466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a" exitCode=0
Mar 12 13:39:21.835127 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.834978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" event={"ID":"f2278add-3ad4-46db-9278-d8a8cab50031","Type":"ContainerDied","Data":"466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a"}
Mar 12 13:39:21.835127 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.834990 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2"
Mar 12 13:39:21.835127 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.835011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b465c86c9-nbbq2" event={"ID":"f2278add-3ad4-46db-9278-d8a8cab50031","Type":"ContainerDied","Data":"4db2fe244d1c1da2a16b987e3a9dcb7bc40ddc9fa467b668d6d50687a81e0807"}
Mar 12 13:39:21.835127 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.835046 2575 scope.go:117] "RemoveContainer" containerID="466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a"
Mar 12 13:39:21.843568 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.843551 2575 scope.go:117] "RemoveContainer" containerID="466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a"
Mar 12 13:39:21.843831 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:39:21.843813 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a\": container with ID starting with 466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a not found: ID does not exist" containerID="466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a"
Mar 12 13:39:21.843886 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.843840 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a"} err="failed to get container status \"466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a\": rpc error: code = NotFound desc = could not find container \"466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a\": container with ID starting with 466860bf154fe07f1e735347a0f93dfe29190c1358e02b104fc9a720e557ac2a not found: ID does not exist"
Mar 12 13:39:21.856818 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.856799 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b465c86c9-nbbq2"]
Mar 12 13:39:21.862893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:21.862873 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6b465c86c9-nbbq2"]
Mar 12 13:39:22.474685 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:22.474635 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2278add-3ad4-46db-9278-d8a8cab50031" path="/var/lib/kubelet/pods/f2278add-3ad4-46db-9278-d8a8cab50031/volumes"
Mar 12 13:39:26.296500 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:26.296460 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:26.296500 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:26.296508 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:26.301506 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:26.301484 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:26.852310 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:26.852284 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:39:28.755165 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:28.755139 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mms2n"
Mar 12 13:39:31.301071 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.301034 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f8b4d584-lnscm"]
Mar 12 13:39:31.301436 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.301276 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2278add-3ad4-46db-9278-d8a8cab50031" containerName="registry"
Mar 12 13:39:31.301436 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.301286 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2278add-3ad4-46db-9278-d8a8cab50031" containerName="registry"
Mar 12 13:39:31.301436 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.301325 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2278add-3ad4-46db-9278-d8a8cab50031" containerName="registry"
Mar 12 13:39:31.306071 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.306053 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.320056 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.320033 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f8b4d584-lnscm"]
Mar 12 13:39:31.427593 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.427564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-service-ca\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.427752 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.427603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-oauth-serving-cert\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.427752 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.427634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72q9z\" (UniqueName: \"kubernetes.io/projected/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-kube-api-access-72q9z\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.427752 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.427721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-oauth-config\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.427752 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.427740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-config\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.427893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.427770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-serving-cert\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.427893 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.427801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-trusted-ca-bundle\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.528693 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.528644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-service-ca\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.528840 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.528702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-oauth-serving-cert\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.528840 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.528731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72q9z\" (UniqueName: \"kubernetes.io/projected/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-kube-api-access-72q9z\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.528840 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.528751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-oauth-config\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.528840 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.528765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-config\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.528840 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.528793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-serving-cert\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.529635 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.529109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-trusted-ca-bundle\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.529635 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.529356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-service-ca\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.529635 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.529573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-oauth-serving-cert\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.529635 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.529588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-config\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.529986 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.529972 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-trusted-ca-bundle\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.531951 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.531929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-serving-cert\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.532074 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.531989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-oauth-config\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.538867 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.538845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72q9z\" (UniqueName: \"kubernetes.io/projected/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-kube-api-access-72q9z\") pod \"console-5f8b4d584-lnscm\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.614667 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.614629 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:31.758463 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.758433 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f8b4d584-lnscm"]
Mar 12 13:39:31.762360 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:39:31.762323 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f7ba65b_fe31_4cb9_bd3e_60343bbd0616.slice/crio-590b541339646a8e57782bdfc546b0fc5f0c2a20e24d467528fa8bfb523ba07b WatchSource:0}: Error finding container 590b541339646a8e57782bdfc546b0fc5f0c2a20e24d467528fa8bfb523ba07b: Status 404 returned error can't find the container with id 590b541339646a8e57782bdfc546b0fc5f0c2a20e24d467528fa8bfb523ba07b
Mar 12 13:39:31.863809 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.863767 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8b4d584-lnscm" event={"ID":"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616","Type":"ContainerStarted","Data":"2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7"}
Mar 12 13:39:31.863809 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.863812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8b4d584-lnscm" event={"ID":"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616","Type":"ContainerStarted","Data":"590b541339646a8e57782bdfc546b0fc5f0c2a20e24d467528fa8bfb523ba07b"}
Mar 12 13:39:31.907372 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:31.907287 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f8b4d584-lnscm" podStartSLOduration=0.907272391 podStartE2EDuration="907.272391ms" podCreationTimestamp="2026-03-12 13:39:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:39:31.905099943 +0000 UTC m=+105.994624027" watchObservedRunningTime="2026-03-12 13:39:31.907272391 +0000 UTC m=+105.996796474"
Mar 12 13:39:41.615536 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:41.615496 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:41.615536 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:41.615542 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:41.620396 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:41.620375 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:41.892803 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:41.892727 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:39:41.946076 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:41.946045 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57cb56fb88-fjnwg"]
Mar 12 13:39:46.293417 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:39:46.293383 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4jz8n_3eb6f831-4019-43bf-9cec-d541e8e0f1dc/serve-healthcheck-canary/0.log"
Mar 12 13:40:06.965484 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:06.965428 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-57cb56fb88-fjnwg" podUID="8bd558aa-d655-4382-980d-2a16b2adaafd" containerName="console" containerID="cri-o://3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21" gracePeriod=15
Mar 12 13:40:07.198383 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.198362 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57cb56fb88-fjnwg_8bd558aa-d655-4382-980d-2a16b2adaafd/console/0.log"
Mar 12 13:40:07.198496 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.198422 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57cb56fb88-fjnwg"
Mar 12 13:40:07.295022 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.294943 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-service-ca\") pod \"8bd558aa-d655-4382-980d-2a16b2adaafd\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") "
Mar 12 13:40:07.295022 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.294978 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-trusted-ca-bundle\") pod \"8bd558aa-d655-4382-980d-2a16b2adaafd\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") "
Mar 12 13:40:07.295022 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.295000 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2kc9\" (UniqueName: \"kubernetes.io/projected/8bd558aa-d655-4382-980d-2a16b2adaafd-kube-api-access-q2kc9\") pod \"8bd558aa-d655-4382-980d-2a16b2adaafd\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") "
Mar 12 13:40:07.295022 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.295019 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-serving-cert\") pod \"8bd558aa-d655-4382-980d-2a16b2adaafd\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") "
Mar 12 13:40:07.295328 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.295034 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-oauth-config\") pod
\"8bd558aa-d655-4382-980d-2a16b2adaafd\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " Mar 12 13:40:07.295328 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.295062 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-oauth-serving-cert\") pod \"8bd558aa-d655-4382-980d-2a16b2adaafd\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " Mar 12 13:40:07.295328 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.295122 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-console-config\") pod \"8bd558aa-d655-4382-980d-2a16b2adaafd\" (UID: \"8bd558aa-d655-4382-980d-2a16b2adaafd\") " Mar 12 13:40:07.295471 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.295443 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-service-ca" (OuterVolumeSpecName: "service-ca") pod "8bd558aa-d655-4382-980d-2a16b2adaafd" (UID: "8bd558aa-d655-4382-980d-2a16b2adaafd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:40:07.295515 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.295486 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8bd558aa-d655-4382-980d-2a16b2adaafd" (UID: "8bd558aa-d655-4382-980d-2a16b2adaafd"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:40:07.295612 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.295589 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8bd558aa-d655-4382-980d-2a16b2adaafd" (UID: "8bd558aa-d655-4382-980d-2a16b2adaafd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:40:07.295788 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.295596 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-console-config" (OuterVolumeSpecName: "console-config") pod "8bd558aa-d655-4382-980d-2a16b2adaafd" (UID: "8bd558aa-d655-4382-980d-2a16b2adaafd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:40:07.297302 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.297278 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8bd558aa-d655-4382-980d-2a16b2adaafd" (UID: "8bd558aa-d655-4382-980d-2a16b2adaafd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:40:07.297379 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.297336 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd558aa-d655-4382-980d-2a16b2adaafd-kube-api-access-q2kc9" (OuterVolumeSpecName: "kube-api-access-q2kc9") pod "8bd558aa-d655-4382-980d-2a16b2adaafd" (UID: "8bd558aa-d655-4382-980d-2a16b2adaafd"). InnerVolumeSpecName "kube-api-access-q2kc9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 13:40:07.297379 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.297359 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8bd558aa-d655-4382-980d-2a16b2adaafd" (UID: "8bd558aa-d655-4382-980d-2a16b2adaafd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:40:07.395934 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.395906 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-oauth-serving-cert\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\"" Mar 12 13:40:07.395934 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.395931 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-console-config\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\"" Mar 12 13:40:07.395934 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.395940 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-service-ca\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\"" Mar 12 13:40:07.396113 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.395949 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bd558aa-d655-4382-980d-2a16b2adaafd-trusted-ca-bundle\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\"" Mar 12 13:40:07.396113 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.395958 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q2kc9\" (UniqueName: 
\"kubernetes.io/projected/8bd558aa-d655-4382-980d-2a16b2adaafd-kube-api-access-q2kc9\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\"" Mar 12 13:40:07.396113 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.395967 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-serving-cert\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\"" Mar 12 13:40:07.396113 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.395977 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bd558aa-d655-4382-980d-2a16b2adaafd-console-oauth-config\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\"" Mar 12 13:40:07.960169 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.960141 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57cb56fb88-fjnwg_8bd558aa-d655-4382-980d-2a16b2adaafd/console/0.log" Mar 12 13:40:07.960338 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.960188 2575 generic.go:358] "Generic (PLEG): container finished" podID="8bd558aa-d655-4382-980d-2a16b2adaafd" containerID="3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21" exitCode=2 Mar 12 13:40:07.960338 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.960222 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cb56fb88-fjnwg" event={"ID":"8bd558aa-d655-4382-980d-2a16b2adaafd","Type":"ContainerDied","Data":"3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21"} Mar 12 13:40:07.960338 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.960258 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57cb56fb88-fjnwg" Mar 12 13:40:07.960338 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.960272 2575 scope.go:117] "RemoveContainer" containerID="3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21" Mar 12 13:40:07.960485 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.960262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cb56fb88-fjnwg" event={"ID":"8bd558aa-d655-4382-980d-2a16b2adaafd","Type":"ContainerDied","Data":"4cde91e245159a7ba9ff87ec6539a1b4ae66541dbc0739f34389480a61484a0d"} Mar 12 13:40:07.969021 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.968826 2575 scope.go:117] "RemoveContainer" containerID="3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21" Mar 12 13:40:07.969225 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:40:07.969085 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21\": container with ID starting with 3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21 not found: ID does not exist" containerID="3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21" Mar 12 13:40:07.969225 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.969119 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21"} err="failed to get container status \"3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21\": rpc error: code = NotFound desc = could not find container \"3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21\": container with ID starting with 3987274642e8d25045ba74f4c8c49f7a35cbf9c777b1802db205f5207ea96c21 not found: ID does not exist" Mar 12 13:40:07.982608 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.982585 2575 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-57cb56fb88-fjnwg"] Mar 12 13:40:07.987185 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:07.987168 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57cb56fb88-fjnwg"] Mar 12 13:40:08.475413 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:08.475382 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd558aa-d655-4382-980d-2a16b2adaafd" path="/var/lib/kubelet/pods/8bd558aa-d655-4382-980d-2a16b2adaafd/volumes" Mar 12 13:40:30.221326 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.221291 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f7586fd48-wjstm"] Mar 12 13:40:30.221775 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.221668 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bd558aa-d655-4382-980d-2a16b2adaafd" containerName="console" Mar 12 13:40:30.221775 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.221683 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd558aa-d655-4382-980d-2a16b2adaafd" containerName="console" Mar 12 13:40:30.221775 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.221737 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bd558aa-d655-4382-980d-2a16b2adaafd" containerName="console" Mar 12 13:40:30.225959 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.225943 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.234213 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.234190 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f7586fd48-wjstm"] Mar 12 13:40:30.355582 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.355543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-oauth-config\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.355582 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.355581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-serving-cert\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.355873 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.355603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-console-config\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.355873 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.355694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kptd8\" (UniqueName: \"kubernetes.io/projected/2250dd18-cc97-4942-a4c0-7f870470cde9-kube-api-access-kptd8\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 
13:40:30.355873 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.355781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-oauth-serving-cert\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.355873 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.355828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-service-ca\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.355873 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.355869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-trusted-ca-bundle\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.457061 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.457027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kptd8\" (UniqueName: \"kubernetes.io/projected/2250dd18-cc97-4942-a4c0-7f870470cde9-kube-api-access-kptd8\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.457188 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.457077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-oauth-serving-cert\") pod 
\"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.457188 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.457096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-service-ca\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.457188 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.457119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-trusted-ca-bundle\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.457188 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.457136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-oauth-config\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.457188 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.457153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-serving-cert\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.457188 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.457171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-console-config\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.457906 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.457877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-oauth-serving-cert\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.458027 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.458002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-console-config\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.458092 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.458037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-trusted-ca-bundle\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.458167 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.458146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-service-ca\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.459814 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.459777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-oauth-config\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.459943 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.459927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-serving-cert\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.466008 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.465987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kptd8\" (UniqueName: \"kubernetes.io/projected/2250dd18-cc97-4942-a4c0-7f870470cde9-kube-api-access-kptd8\") pod \"console-f7586fd48-wjstm\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") " pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.535965 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.535903 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:30.656679 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:30.656612 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f7586fd48-wjstm"] Mar 12 13:40:30.660775 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:40:30.660749 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2250dd18_cc97_4942_a4c0_7f870470cde9.slice/crio-f0995d2270590c5b1e62f9b95adc2b7a8e841154f8729107dbf700ee4e05be7d WatchSource:0}: Error finding container f0995d2270590c5b1e62f9b95adc2b7a8e841154f8729107dbf700ee4e05be7d: Status 404 returned error can't find the container with id f0995d2270590c5b1e62f9b95adc2b7a8e841154f8729107dbf700ee4e05be7d Mar 12 13:40:31.024169 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:31.024135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f7586fd48-wjstm" event={"ID":"2250dd18-cc97-4942-a4c0-7f870470cde9","Type":"ContainerStarted","Data":"6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc"} Mar 12 13:40:31.024169 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:31.024169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f7586fd48-wjstm" event={"ID":"2250dd18-cc97-4942-a4c0-7f870470cde9","Type":"ContainerStarted","Data":"f0995d2270590c5b1e62f9b95adc2b7a8e841154f8729107dbf700ee4e05be7d"} Mar 12 13:40:31.048007 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:31.047960 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f7586fd48-wjstm" podStartSLOduration=1.047946163 podStartE2EDuration="1.047946163s" podCreationTimestamp="2026-03-12 13:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:40:31.046990339 +0000 UTC m=+165.136514434" 
watchObservedRunningTime="2026-03-12 13:40:31.047946163 +0000 UTC m=+165.137470246" Mar 12 13:40:40.536566 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:40.536467 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:40.537016 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:40.536571 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:40.541269 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:40.541247 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:41.054307 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:41.054276 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f7586fd48-wjstm" Mar 12 13:40:41.104927 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:40:41.104897 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f8b4d584-lnscm"] Mar 12 13:41:06.124218 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.124151 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f8b4d584-lnscm" podUID="8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" containerName="console" containerID="cri-o://2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7" gracePeriod=15 Mar 12 13:41:06.370718 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.370694 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f8b4d584-lnscm_8f7ba65b-fe31-4cb9-bd3e-60343bbd0616/console/0.log" Mar 12 13:41:06.370826 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.370754 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f8b4d584-lnscm" Mar 12 13:41:06.528593 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.528502 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72q9z\" (UniqueName: \"kubernetes.io/projected/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-kube-api-access-72q9z\") pod \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " Mar 12 13:41:06.528593 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.528543 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-config\") pod \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " Mar 12 13:41:06.528593 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.528590 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-service-ca\") pod \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " Mar 12 13:41:06.528906 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.528701 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-oauth-serving-cert\") pod \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " Mar 12 13:41:06.528906 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.528743 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-oauth-config\") pod \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") " Mar 12 13:41:06.528906 ip-10-0-139-20 
kubenswrapper[2575]: I0312 13:41:06.528759 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-serving-cert\") pod \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") "
Mar 12 13:41:06.528906 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.528787 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-trusted-ca-bundle\") pod \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\" (UID: \"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616\") "
Mar 12 13:41:06.529139 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.529031 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-config" (OuterVolumeSpecName: "console-config") pod "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" (UID: "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:41:06.529139 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.529108 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" (UID: "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:41:06.529294 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.529206 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-service-ca" (OuterVolumeSpecName: "service-ca") pod "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" (UID: "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:41:06.529294 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.529240 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" (UID: "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:41:06.531112 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.531081 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" (UID: "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:41:06.531447 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.531424 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" (UID: "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:41:06.531508 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.531443 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-kube-api-access-72q9z" (OuterVolumeSpecName: "kube-api-access-72q9z") pod "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" (UID: "8f7ba65b-fe31-4cb9-bd3e-60343bbd0616"). InnerVolumeSpecName "kube-api-access-72q9z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:41:06.630184 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.630153 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-service-ca\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:41:06.630184 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.630179 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-oauth-serving-cert\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:41:06.630184 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.630188 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-oauth-config\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:41:06.630399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.630198 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-serving-cert\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:41:06.630399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.630207 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-trusted-ca-bundle\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:41:06.630399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.630216 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72q9z\" (UniqueName: \"kubernetes.io/projected/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-kube-api-access-72q9z\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:41:06.630399 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:06.630226 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616-console-config\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:41:07.116914 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.116887 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f8b4d584-lnscm_8f7ba65b-fe31-4cb9-bd3e-60343bbd0616/console/0.log"
Mar 12 13:41:07.117074 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.116931 2575 generic.go:358] "Generic (PLEG): container finished" podID="8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" containerID="2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7" exitCode=2
Mar 12 13:41:07.117074 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.116977 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8b4d584-lnscm" event={"ID":"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616","Type":"ContainerDied","Data":"2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7"}
Mar 12 13:41:07.117074 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.117002 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f8b4d584-lnscm"
Mar 12 13:41:07.117074 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.117018 2575 scope.go:117] "RemoveContainer" containerID="2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7"
Mar 12 13:41:07.117244 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.117005 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8b4d584-lnscm" event={"ID":"8f7ba65b-fe31-4cb9-bd3e-60343bbd0616","Type":"ContainerDied","Data":"590b541339646a8e57782bdfc546b0fc5f0c2a20e24d467528fa8bfb523ba07b"}
Mar 12 13:41:07.125283 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.125123 2575 scope.go:117] "RemoveContainer" containerID="2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7"
Mar 12 13:41:07.125501 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:41:07.125361 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7\": container with ID starting with 2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7 not found: ID does not exist" containerID="2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7"
Mar 12 13:41:07.125501 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.125384 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7"} err="failed to get container status \"2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7\": rpc error: code = NotFound desc = could not find container \"2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7\": container with ID starting with 2a494119462ee212ea37e0fe1296ecc60efa913334712225a6bf74e070e0d8e7 not found: ID does not exist"
Mar 12 13:41:07.138746 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.138726 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f8b4d584-lnscm"]
Mar 12 13:41:07.144077 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:07.144055 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f8b4d584-lnscm"]
Mar 12 13:41:08.474601 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:41:08.474570 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" path="/var/lib/kubelet/pods/8f7ba65b-fe31-4cb9-bd3e-60343bbd0616/volumes"
Mar 12 13:42:46.411645 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:42:46.411607 2575 kubelet.go:1628] "Image garbage collection succeeded"
Mar 12 13:44:33.888955 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.888925 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86d5bb5598-7frnq"]
Mar 12 13:44:33.889407 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.889184 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" containerName="console"
Mar 12 13:44:33.889407 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.889196 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" containerName="console"
Mar 12 13:44:33.889407 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.889237 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f7ba65b-fe31-4cb9-bd3e-60343bbd0616" containerName="console"
Mar 12 13:44:33.891918 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.891902 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:33.924168 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.924144 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d5bb5598-7frnq"]
Mar 12 13:44:33.993397 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.993368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-oauth-serving-cert\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:33.993561 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.993414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-service-ca\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:33.993561 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.993433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6tc4\" (UniqueName: \"kubernetes.io/projected/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-kube-api-access-f6tc4\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:33.993561 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.993452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-trusted-ca-bundle\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:33.993561 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.993526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-console-serving-cert\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:33.993760 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.993568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-console-oauth-config\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:33.993760 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:33.993620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-console-config\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.094722 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.094681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-console-serving-cert\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.094918 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.094732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-console-oauth-config\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.094918 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.094765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-console-config\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.094918 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.094795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-oauth-serving-cert\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.094918 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.094851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-service-ca\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.094918 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.094874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6tc4\" (UniqueName: \"kubernetes.io/projected/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-kube-api-access-f6tc4\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.094918 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.094898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-trusted-ca-bundle\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.095563 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.095537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-console-config\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.095815 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.095788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-trusted-ca-bundle\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.095920 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.095896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-oauth-serving-cert\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.095974 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.095952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-service-ca\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.097437 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.097417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-console-serving-cert\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.097527 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.097457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-console-oauth-config\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.104088 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.104069 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6tc4\" (UniqueName: \"kubernetes.io/projected/3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f-kube-api-access-f6tc4\") pod \"console-86d5bb5598-7frnq\" (UID: \"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f\") " pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.200991 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.200898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:34.323877 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.323849 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d5bb5598-7frnq"]
Mar 12 13:44:34.327905 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:44:34.327869 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e0cfd26_6a5b_4e62_b48e_a4ed8612d32f.slice/crio-1c7158ccc0c248db6489a1044bd96dbdff6a2577299c0826033d127fe74f109f WatchSource:0}: Error finding container 1c7158ccc0c248db6489a1044bd96dbdff6a2577299c0826033d127fe74f109f: Status 404 returned error can't find the container with id 1c7158ccc0c248db6489a1044bd96dbdff6a2577299c0826033d127fe74f109f
Mar 12 13:44:34.329551 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.329535 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 13:44:34.642160 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.642123 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d5bb5598-7frnq" event={"ID":"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f","Type":"ContainerStarted","Data":"964b1d82559e27c6151b070d3b2179e71334e7073271585d176f5983dba95169"}
Mar 12 13:44:34.642160 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.642164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d5bb5598-7frnq" event={"ID":"3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f","Type":"ContainerStarted","Data":"1c7158ccc0c248db6489a1044bd96dbdff6a2577299c0826033d127fe74f109f"}
Mar 12 13:44:34.664881 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:34.664835 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86d5bb5598-7frnq" podStartSLOduration=1.664823173 podStartE2EDuration="1.664823173s" podCreationTimestamp="2026-03-12 13:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:44:34.663907736 +0000 UTC m=+408.753431820" watchObservedRunningTime="2026-03-12 13:44:34.664823173 +0000 UTC m=+408.754347257"
Mar 12 13:44:44.202072 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:44.201952 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:44.202072 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:44.202019 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:44.206841 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:44.206816 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:44.670032 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:44.670002 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86d5bb5598-7frnq"
Mar 12 13:44:44.722927 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:44:44.721861 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f7586fd48-wjstm"]
Mar 12 13:45:09.744183 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:09.744092 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f7586fd48-wjstm" podUID="2250dd18-cc97-4942-a4c0-7f870470cde9" containerName="console" containerID="cri-o://6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc" gracePeriod=15
Mar 12 13:45:09.976588 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:09.976567 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f7586fd48-wjstm_2250dd18-cc97-4942-a4c0-7f870470cde9/console/0.log"
Mar 12 13:45:09.976732 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:09.976626 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f7586fd48-wjstm"
Mar 12 13:45:10.044339 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044269 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-service-ca\") pod \"2250dd18-cc97-4942-a4c0-7f870470cde9\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") "
Mar 12 13:45:10.044339 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044305 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-console-config\") pod \"2250dd18-cc97-4942-a4c0-7f870470cde9\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") "
Mar 12 13:45:10.044339 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044332 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-trusted-ca-bundle\") pod \"2250dd18-cc97-4942-a4c0-7f870470cde9\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") "
Mar 12 13:45:10.044569 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044370 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-oauth-config\") pod \"2250dd18-cc97-4942-a4c0-7f870470cde9\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") "
Mar 12 13:45:10.044569 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044393 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-serving-cert\") pod \"2250dd18-cc97-4942-a4c0-7f870470cde9\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") "
Mar 12 13:45:10.044569 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044436 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kptd8\" (UniqueName: \"kubernetes.io/projected/2250dd18-cc97-4942-a4c0-7f870470cde9-kube-api-access-kptd8\") pod \"2250dd18-cc97-4942-a4c0-7f870470cde9\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") "
Mar 12 13:45:10.044569 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044475 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-oauth-serving-cert\") pod \"2250dd18-cc97-4942-a4c0-7f870470cde9\" (UID: \"2250dd18-cc97-4942-a4c0-7f870470cde9\") "
Mar 12 13:45:10.044790 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044737 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-console-config" (OuterVolumeSpecName: "console-config") pod "2250dd18-cc97-4942-a4c0-7f870470cde9" (UID: "2250dd18-cc97-4942-a4c0-7f870470cde9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:45:10.044851 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044680 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-service-ca" (OuterVolumeSpecName: "service-ca") pod "2250dd18-cc97-4942-a4c0-7f870470cde9" (UID: "2250dd18-cc97-4942-a4c0-7f870470cde9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:45:10.044957 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.044935 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2250dd18-cc97-4942-a4c0-7f870470cde9" (UID: "2250dd18-cc97-4942-a4c0-7f870470cde9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:45:10.045027 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.045004 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2250dd18-cc97-4942-a4c0-7f870470cde9" (UID: "2250dd18-cc97-4942-a4c0-7f870470cde9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:45:10.046798 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.046778 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2250dd18-cc97-4942-a4c0-7f870470cde9" (UID: "2250dd18-cc97-4942-a4c0-7f870470cde9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:45:10.047110 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.047082 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2250dd18-cc97-4942-a4c0-7f870470cde9" (UID: "2250dd18-cc97-4942-a4c0-7f870470cde9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:45:10.047183 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.047107 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2250dd18-cc97-4942-a4c0-7f870470cde9-kube-api-access-kptd8" (OuterVolumeSpecName: "kube-api-access-kptd8") pod "2250dd18-cc97-4942-a4c0-7f870470cde9" (UID: "2250dd18-cc97-4942-a4c0-7f870470cde9"). InnerVolumeSpecName "kube-api-access-kptd8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:45:10.145513 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.145485 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-service-ca\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:45:10.145513 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.145510 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-console-config\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:45:10.145691 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.145519 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-trusted-ca-bundle\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:45:10.145691 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.145529 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-oauth-config\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:45:10.145691 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.145538 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2250dd18-cc97-4942-a4c0-7f870470cde9-console-serving-cert\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:45:10.145691 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.145549 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kptd8\" (UniqueName: \"kubernetes.io/projected/2250dd18-cc97-4942-a4c0-7f870470cde9-kube-api-access-kptd8\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:45:10.145691 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.145558 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2250dd18-cc97-4942-a4c0-7f870470cde9-oauth-serving-cert\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\""
Mar 12 13:45:10.733634 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.733560 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f7586fd48-wjstm_2250dd18-cc97-4942-a4c0-7f870470cde9/console/0.log"
Mar 12 13:45:10.733634 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.733599 2575 generic.go:358] "Generic (PLEG): container finished" podID="2250dd18-cc97-4942-a4c0-7f870470cde9" containerID="6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc" exitCode=2
Mar 12 13:45:10.733821 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.733680 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f7586fd48-wjstm"
Mar 12 13:45:10.733821 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.733692 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f7586fd48-wjstm" event={"ID":"2250dd18-cc97-4942-a4c0-7f870470cde9","Type":"ContainerDied","Data":"6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc"}
Mar 12 13:45:10.733821 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.733732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f7586fd48-wjstm" event={"ID":"2250dd18-cc97-4942-a4c0-7f870470cde9","Type":"ContainerDied","Data":"f0995d2270590c5b1e62f9b95adc2b7a8e841154f8729107dbf700ee4e05be7d"}
Mar 12 13:45:10.733821 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.733762 2575 scope.go:117] "RemoveContainer" containerID="6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc"
Mar 12 13:45:10.741767 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.741750 2575 scope.go:117] "RemoveContainer" containerID="6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc"
Mar 12 13:45:10.742016 ip-10-0-139-20 kubenswrapper[2575]: E0312 13:45:10.741999 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc\": container with ID starting with 6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc not found: ID does not exist" containerID="6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc"
Mar 12 13:45:10.742065 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.742025 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc"} err="failed to get container status \"6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc\": rpc error: code = NotFound desc = could not find container \"6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc\": container with ID starting with 6dbae87509a8af8ee58e2c5b3e9ea8af760c206251d8d8461001664422139fcc not found: ID does not exist"
Mar 12 13:45:10.753696 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.753670 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f7586fd48-wjstm"]
Mar 12 13:45:10.757176 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:10.757156 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f7586fd48-wjstm"]
Mar 12 13:45:12.475070 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:45:12.475035 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2250dd18-cc97-4942-a4c0-7f870470cde9" path="/var/lib/kubelet/pods/2250dd18-cc97-4942-a4c0-7f870470cde9/volumes"
Mar 12 13:57:34.500898 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.500825 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"]
Mar 12 13:57:34.501332 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.501054 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2250dd18-cc97-4942-a4c0-7f870470cde9" containerName="console"
Mar 12 13:57:34.501332 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.501064 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2250dd18-cc97-4942-a4c0-7f870470cde9" containerName="console"
Mar 12 13:57:34.501332 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.501111 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2250dd18-cc97-4942-a4c0-7f870470cde9" containerName="console"
Mar 12 13:57:34.503855 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.503839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"
Mar 12 13:57:34.506704 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.506684 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-nlrws\"/\"default-dockercfg-bn65w\""
Mar 12 13:57:34.506811 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.506704 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"openshift-service-ca.crt\""
Mar 12 13:57:34.506811 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.506730 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"kube-root-ca.crt\""
Mar 12 13:57:34.512912 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.512891 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"]
Mar 12 13:57:34.645486 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.645430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz44s\" (UniqueName: \"kubernetes.io/projected/d2f498ae-50c5-476e-a83f-7395299223df-kube-api-access-cz44s\") pod \"progression-custom-config-node-0-0-4lh6z\" (UID: \"d2f498ae-50c5-476e-a83f-7395299223df\") " pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"
Mar 12 13:57:34.746670 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.746626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz44s\" (UniqueName: \"kubernetes.io/projected/d2f498ae-50c5-476e-a83f-7395299223df-kube-api-access-cz44s\") pod \"progression-custom-config-node-0-0-4lh6z\" (UID: \"d2f498ae-50c5-476e-a83f-7395299223df\") " pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"
Mar 12 13:57:34.755267 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.755201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz44s\" (UniqueName: \"kubernetes.io/projected/d2f498ae-50c5-476e-a83f-7395299223df-kube-api-access-cz44s\") pod \"progression-custom-config-node-0-0-4lh6z\" (UID: \"d2f498ae-50c5-476e-a83f-7395299223df\") " pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"
Mar 12 13:57:34.813026 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.813002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"
Mar 12 13:57:34.940028 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.937409 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"]
Mar 12 13:57:34.953047 ip-10-0-139-20 kubenswrapper[2575]: W0312 13:57:34.953018 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f498ae_50c5_476e_a83f_7395299223df.slice/crio-b3cb6a38163cae087e8f0b274e7a11e8a4d1645332a2a70dee90c002304e02c3 WatchSource:0}: Error finding container b3cb6a38163cae087e8f0b274e7a11e8a4d1645332a2a70dee90c002304e02c3: Status 404 returned error can't find the container with id b3cb6a38163cae087e8f0b274e7a11e8a4d1645332a2a70dee90c002304e02c3
Mar 12 13:57:34.955059 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:34.955044 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 13:57:35.663965 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:57:35.663906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" event={"ID":"d2f498ae-50c5-476e-a83f-7395299223df","Type":"ContainerStarted","Data":"b3cb6a38163cae087e8f0b274e7a11e8a4d1645332a2a70dee90c002304e02c3"}
Mar 12 13:59:16.949604 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:16.949566 2575
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" event={"ID":"d2f498ae-50c5-476e-a83f-7395299223df","Type":"ContainerStarted","Data":"66d160b17f09788a2162010835c4de65785c06dac51713ae5a8ca7ed8d367e06"} Mar 12 13:59:16.950037 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:16.949692 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" Mar 12 13:59:16.973180 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:16.973133 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" podStartSLOduration=1.192215419 podStartE2EDuration="1m42.973120649s" podCreationTimestamp="2026-03-12 13:57:34 +0000 UTC" firstStartedPulling="2026-03-12 13:57:34.955163282 +0000 UTC m=+1189.044687346" lastFinishedPulling="2026-03-12 13:59:16.736068511 +0000 UTC m=+1290.825592576" observedRunningTime="2026-03-12 13:59:16.973101005 +0000 UTC m=+1291.062625088" watchObservedRunningTime="2026-03-12 13:59:16.973120649 +0000 UTC m=+1291.062644732" Mar 12 13:59:18.960404 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:18.960372 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" Mar 12 13:59:32.958912 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:32.958867 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" podUID="d2f498ae-50c5-476e-a83f-7395299223df" containerName="node" probeResult="failure" output="Get \"http://10.133.0.20:28080/metrics\": dial tcp 10.133.0.20:28080: connect: connection refused" Mar 12 13:59:33.958674 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:33.958612 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" podUID="d2f498ae-50c5-476e-a83f-7395299223df" containerName="node" probeResult="failure" output="Get \"http://10.133.0.20:28080/metrics\": dial tcp 10.133.0.20:28080: connect: connection refused" Mar 12 13:59:33.958824 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:33.958748 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" Mar 12 13:59:33.959200 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:33.959169 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" podUID="d2f498ae-50c5-476e-a83f-7395299223df" containerName="node" probeResult="failure" output="Get \"http://10.133.0.20:28080/metrics\": dial tcp 10.133.0.20:28080: connect: connection refused" Mar 12 13:59:34.001206 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:34.001177 2575 generic.go:358] "Generic (PLEG): container finished" podID="d2f498ae-50c5-476e-a83f-7395299223df" containerID="66d160b17f09788a2162010835c4de65785c06dac51713ae5a8ca7ed8d367e06" exitCode=0 Mar 12 13:59:34.001362 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:34.001243 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" event={"ID":"d2f498ae-50c5-476e-a83f-7395299223df","Type":"ContainerDied","Data":"66d160b17f09788a2162010835c4de65785c06dac51713ae5a8ca7ed8d367e06"} Mar 12 13:59:35.131766 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:35.131734 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" Mar 12 13:59:35.226847 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:35.226812 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz44s\" (UniqueName: \"kubernetes.io/projected/d2f498ae-50c5-476e-a83f-7395299223df-kube-api-access-cz44s\") pod \"d2f498ae-50c5-476e-a83f-7395299223df\" (UID: \"d2f498ae-50c5-476e-a83f-7395299223df\") " Mar 12 13:59:35.229192 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:35.229158 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f498ae-50c5-476e-a83f-7395299223df-kube-api-access-cz44s" (OuterVolumeSpecName: "kube-api-access-cz44s") pod "d2f498ae-50c5-476e-a83f-7395299223df" (UID: "d2f498ae-50c5-476e-a83f-7395299223df"). InnerVolumeSpecName "kube-api-access-cz44s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 13:59:35.327696 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:35.327671 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cz44s\" (UniqueName: \"kubernetes.io/projected/d2f498ae-50c5-476e-a83f-7395299223df-kube-api-access-cz44s\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\"" Mar 12 13:59:36.006986 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:36.006952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" event={"ID":"d2f498ae-50c5-476e-a83f-7395299223df","Type":"ContainerDied","Data":"b3cb6a38163cae087e8f0b274e7a11e8a4d1645332a2a70dee90c002304e02c3"} Mar 12 13:59:36.006986 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:36.006983 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z" Mar 12 13:59:36.007186 ip-10-0-139-20 kubenswrapper[2575]: I0312 13:59:36.006987 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3cb6a38163cae087e8f0b274e7a11e8a4d1645332a2a70dee90c002304e02c3" Mar 12 14:09:39.201869 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.201779 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs"] Mar 12 14:09:39.202430 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.202119 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2f498ae-50c5-476e-a83f-7395299223df" containerName="node" Mar 12 14:09:39.202430 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.202138 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f498ae-50c5-476e-a83f-7395299223df" containerName="node" Mar 12 14:09:39.202430 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.202210 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2f498ae-50c5-476e-a83f-7395299223df" containerName="node" Mar 12 14:09:39.205039 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.205019 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" Mar 12 14:09:39.208662 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.208624 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"kube-root-ca.crt\"" Mar 12 14:09:39.208750 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.208677 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"openshift-service-ca.crt\"" Mar 12 14:09:39.210047 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.210025 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-nlrws\"/\"default-dockercfg-bn65w\"" Mar 12 14:09:39.214824 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.214803 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs"] Mar 12 14:09:39.305292 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.305256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mgfk\" (UniqueName: \"kubernetes.io/projected/48fcb66a-d730-4817-bd83-fd3cd38515b6-kube-api-access-4mgfk\") pod \"no-prestop-hook-node-0-0-gpjbs\" (UID: \"48fcb66a-d730-4817-bd83-fd3cd38515b6\") " pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" Mar 12 14:09:39.405896 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.405861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mgfk\" (UniqueName: \"kubernetes.io/projected/48fcb66a-d730-4817-bd83-fd3cd38515b6-kube-api-access-4mgfk\") pod \"no-prestop-hook-node-0-0-gpjbs\" (UID: \"48fcb66a-d730-4817-bd83-fd3cd38515b6\") " pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" Mar 12 14:09:39.415428 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.415403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-4mgfk\" (UniqueName: \"kubernetes.io/projected/48fcb66a-d730-4817-bd83-fd3cd38515b6-kube-api-access-4mgfk\") pod \"no-prestop-hook-node-0-0-gpjbs\" (UID: \"48fcb66a-d730-4817-bd83-fd3cd38515b6\") " pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" Mar 12 14:09:39.514710 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.514630 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" Mar 12 14:09:39.637956 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.637845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs"] Mar 12 14:09:39.640451 ip-10-0-139-20 kubenswrapper[2575]: W0312 14:09:39.640422 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fcb66a_d730_4817_bd83_fd3cd38515b6.slice/crio-e4cf128abbf00bdc3a68a461af28894d410ecd5d6d816655d217790057d4eb25 WatchSource:0}: Error finding container e4cf128abbf00bdc3a68a461af28894d410ecd5d6d816655d217790057d4eb25: Status 404 returned error can't find the container with id e4cf128abbf00bdc3a68a461af28894d410ecd5d6d816655d217790057d4eb25 Mar 12 14:09:39.642415 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:39.642401 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:09:40.596825 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:40.596789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" event={"ID":"48fcb66a-d730-4817-bd83-fd3cd38515b6","Type":"ContainerStarted","Data":"e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d"} Mar 12 14:09:40.596825 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:40.596824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" 
event={"ID":"48fcb66a-d730-4817-bd83-fd3cd38515b6","Type":"ContainerStarted","Data":"e4cf128abbf00bdc3a68a461af28894d410ecd5d6d816655d217790057d4eb25"} Mar 12 14:09:40.597340 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:40.596917 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" Mar 12 14:09:40.617119 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:40.617076 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" podStartSLOduration=1.617062707 podStartE2EDuration="1.617062707s" podCreationTimestamp="2026-03-12 14:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:09:40.615682356 +0000 UTC m=+1914.705206437" watchObservedRunningTime="2026-03-12 14:09:40.617062707 +0000 UTC m=+1914.706586792" Mar 12 14:09:41.600318 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:41.600287 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" Mar 12 14:09:48.440836 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:48.440802 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs"] Mar 12 14:09:48.441339 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:48.441008 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" podUID="48fcb66a-d730-4817-bd83-fd3cd38515b6" containerName="node" containerID="cri-o://e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d" gracePeriod=30 Mar 12 14:09:48.445682 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:48.445633 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"] Mar 12 14:09:48.448376 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:48.448355 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-nlrws/progression-custom-config-node-0-0-4lh6z"] Mar 12 14:09:48.475160 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:48.475134 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f498ae-50c5-476e-a83f-7395299223df" path="/var/lib/kubelet/pods/d2f498ae-50c5-476e-a83f-7395299223df/volumes" Mar 12 14:09:55.703499 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:55.703449 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" podUID="48fcb66a-d730-4817-bd83-fd3cd38515b6" containerName="node" probeResult="failure" output="Get \"http://10.133.0.21:28080/metrics\": read tcp 10.133.0.2:56658->10.133.0.21:28080: read: connection reset by peer" Mar 12 14:09:55.875809 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:55.875787 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" Mar 12 14:09:55.910516 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:55.910491 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mgfk\" (UniqueName: \"kubernetes.io/projected/48fcb66a-d730-4817-bd83-fd3cd38515b6-kube-api-access-4mgfk\") pod \"48fcb66a-d730-4817-bd83-fd3cd38515b6\" (UID: \"48fcb66a-d730-4817-bd83-fd3cd38515b6\") " Mar 12 14:09:55.912572 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:55.912550 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fcb66a-d730-4817-bd83-fd3cd38515b6-kube-api-access-4mgfk" (OuterVolumeSpecName: "kube-api-access-4mgfk") pod "48fcb66a-d730-4817-bd83-fd3cd38515b6" (UID: "48fcb66a-d730-4817-bd83-fd3cd38515b6"). 
InnerVolumeSpecName "kube-api-access-4mgfk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 14:09:56.011124 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.011063 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mgfk\" (UniqueName: \"kubernetes.io/projected/48fcb66a-d730-4817-bd83-fd3cd38515b6-kube-api-access-4mgfk\") on node \"ip-10-0-139-20.ec2.internal\" DevicePath \"\"" Mar 12 14:09:56.639184 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.639144 2575 generic.go:358] "Generic (PLEG): container finished" podID="48fcb66a-d730-4817-bd83-fd3cd38515b6" containerID="e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d" exitCode=0 Mar 12 14:09:56.639370 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.639232 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" Mar 12 14:09:56.639370 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.639234 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" event={"ID":"48fcb66a-d730-4817-bd83-fd3cd38515b6","Type":"ContainerDied","Data":"e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d"} Mar 12 14:09:56.639370 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.639285 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs" event={"ID":"48fcb66a-d730-4817-bd83-fd3cd38515b6","Type":"ContainerDied","Data":"e4cf128abbf00bdc3a68a461af28894d410ecd5d6d816655d217790057d4eb25"} Mar 12 14:09:56.639370 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.639306 2575 scope.go:117] "RemoveContainer" containerID="e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d" Mar 12 14:09:56.647051 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.647032 2575 scope.go:117] "RemoveContainer" 
containerID="e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d" Mar 12 14:09:56.647308 ip-10-0-139-20 kubenswrapper[2575]: E0312 14:09:56.647285 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d\": container with ID starting with e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d not found: ID does not exist" containerID="e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d" Mar 12 14:09:56.647397 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.647317 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d"} err="failed to get container status \"e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d\": rpc error: code = NotFound desc = could not find container \"e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d\": container with ID starting with e0e6dbabd31d4d47699a681b304dff5f89670797096cc10fadb38cce056c7b9d not found: ID does not exist" Mar 12 14:09:56.656813 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.656793 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs"] Mar 12 14:09:56.660700 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:56.660673 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-nlrws/no-prestop-hook-node-0-0-gpjbs"] Mar 12 14:09:58.474908 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:09:58.474872 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fcb66a-d730-4817-bd83-fd3cd38515b6" path="/var/lib/kubelet/pods/48fcb66a-d730-4817-bd83-fd3cd38515b6/volumes" Mar 12 14:10:38.290518 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.290444 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-nssb5/must-gather-hdq6b"] Mar 12 14:10:38.290987 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.290718 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48fcb66a-d730-4817-bd83-fd3cd38515b6" containerName="node" Mar 12 14:10:38.290987 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.290728 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fcb66a-d730-4817-bd83-fd3cd38515b6" containerName="node" Mar 12 14:10:38.290987 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.290771 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="48fcb66a-d730-4817-bd83-fd3cd38515b6" containerName="node" Mar 12 14:10:38.293538 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.293519 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nssb5/must-gather-hdq6b" Mar 12 14:10:38.296332 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.296308 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nssb5\"/\"kube-root-ca.crt\"" Mar 12 14:10:38.297588 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.297568 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nssb5\"/\"default-dockercfg-5x56g\"" Mar 12 14:10:38.297697 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.297590 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nssb5\"/\"openshift-service-ca.crt\"" Mar 12 14:10:38.299979 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.299959 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nssb5/must-gather-hdq6b"] Mar 12 14:10:38.407497 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.407465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc59c\" (UniqueName: 
\"kubernetes.io/projected/f50e0457-94d2-4d1f-9459-1e813cecced3-kube-api-access-hc59c\") pod \"must-gather-hdq6b\" (UID: \"f50e0457-94d2-4d1f-9459-1e813cecced3\") " pod="openshift-must-gather-nssb5/must-gather-hdq6b" Mar 12 14:10:38.407689 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.407519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f50e0457-94d2-4d1f-9459-1e813cecced3-must-gather-output\") pod \"must-gather-hdq6b\" (UID: \"f50e0457-94d2-4d1f-9459-1e813cecced3\") " pod="openshift-must-gather-nssb5/must-gather-hdq6b" Mar 12 14:10:38.508611 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.508576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f50e0457-94d2-4d1f-9459-1e813cecced3-must-gather-output\") pod \"must-gather-hdq6b\" (UID: \"f50e0457-94d2-4d1f-9459-1e813cecced3\") " pod="openshift-must-gather-nssb5/must-gather-hdq6b" Mar 12 14:10:38.508770 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.508638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc59c\" (UniqueName: \"kubernetes.io/projected/f50e0457-94d2-4d1f-9459-1e813cecced3-kube-api-access-hc59c\") pod \"must-gather-hdq6b\" (UID: \"f50e0457-94d2-4d1f-9459-1e813cecced3\") " pod="openshift-must-gather-nssb5/must-gather-hdq6b" Mar 12 14:10:38.508993 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.508974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f50e0457-94d2-4d1f-9459-1e813cecced3-must-gather-output\") pod \"must-gather-hdq6b\" (UID: \"f50e0457-94d2-4d1f-9459-1e813cecced3\") " pod="openshift-must-gather-nssb5/must-gather-hdq6b" Mar 12 14:10:38.517342 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.517318 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hc59c\" (UniqueName: \"kubernetes.io/projected/f50e0457-94d2-4d1f-9459-1e813cecced3-kube-api-access-hc59c\") pod \"must-gather-hdq6b\" (UID: \"f50e0457-94d2-4d1f-9459-1e813cecced3\") " pod="openshift-must-gather-nssb5/must-gather-hdq6b" Mar 12 14:10:38.603584 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.603563 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nssb5/must-gather-hdq6b" Mar 12 14:10:38.740882 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.740849 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nssb5/must-gather-hdq6b"] Mar 12 14:10:38.744401 ip-10-0-139-20 kubenswrapper[2575]: W0312 14:10:38.744373 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf50e0457_94d2_4d1f_9459_1e813cecced3.slice/crio-358da002b74f73d02901c7c37f3134306ff031c92f79a6360b0549927bff3b38 WatchSource:0}: Error finding container 358da002b74f73d02901c7c37f3134306ff031c92f79a6360b0549927bff3b38: Status 404 returned error can't find the container with id 358da002b74f73d02901c7c37f3134306ff031c92f79a6360b0549927bff3b38 Mar 12 14:10:38.751150 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:38.751110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nssb5/must-gather-hdq6b" event={"ID":"f50e0457-94d2-4d1f-9459-1e813cecced3","Type":"ContainerStarted","Data":"358da002b74f73d02901c7c37f3134306ff031c92f79a6360b0549927bff3b38"} Mar 12 14:10:40.757921 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:40.757878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nssb5/must-gather-hdq6b" event={"ID":"f50e0457-94d2-4d1f-9459-1e813cecced3","Type":"ContainerStarted","Data":"8e43dbb9702677cd604ab0234099b943849ea418da653aed34ab32a12cd31ed1"} Mar 12 14:10:40.757921 ip-10-0-139-20 kubenswrapper[2575]: I0312 
14:10:40.757926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nssb5/must-gather-hdq6b" event={"ID":"f50e0457-94d2-4d1f-9459-1e813cecced3","Type":"ContainerStarted","Data":"a2e2b26bb5e6a33a4770670b97b4460d897afebe9965ba23d93aad5fb4cc4c3d"} Mar 12 14:10:40.773070 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:40.773015 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nssb5/must-gather-hdq6b" podStartSLOduration=1.861434694 podStartE2EDuration="2.772998575s" podCreationTimestamp="2026-03-12 14:10:38 +0000 UTC" firstStartedPulling="2026-03-12 14:10:38.746503588 +0000 UTC m=+1972.836027650" lastFinishedPulling="2026-03-12 14:10:39.658067467 +0000 UTC m=+1973.747591531" observedRunningTime="2026-03-12 14:10:40.772122988 +0000 UTC m=+1974.861647074" watchObservedRunningTime="2026-03-12 14:10:40.772998575 +0000 UTC m=+1974.862522661" Mar 12 14:10:41.027341 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:41.027269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8b6fx_a36989ce-1faa-4a64-9750-ffc5facf702b/global-pull-secret-syncer/0.log" Mar 12 14:10:41.208940 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:41.208911 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-w2psw_4e3e43f3-29b4-45df-8953-0095e22a0d55/konnectivity-agent/0.log" Mar 12 14:10:41.259157 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:41.259133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-20.ec2.internal_31268ff111c3b6a3e6157c28dc36e73e/haproxy/0.log" Mar 12 14:10:44.543759 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:44.543720 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vpw9p_d7e4b839-6cf6-4c5c-a29c-34cc8c85e230/node-exporter/0.log" Mar 12 14:10:44.568505 ip-10-0-139-20 kubenswrapper[2575]: I0312 
14:10:44.568482 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vpw9p_d7e4b839-6cf6-4c5c-a29c-34cc8c85e230/kube-rbac-proxy/0.log"
Mar 12 14:10:44.592013 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:44.591971 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vpw9p_d7e4b839-6cf6-4c5c-a29c-34cc8c85e230/init-textfile/0.log"
Mar 12 14:10:46.535463 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:46.535431 2575 scope.go:117] "RemoveContainer" containerID="66d160b17f09788a2162010835c4de65785c06dac51713ae5a8ca7ed8d367e06"
Mar 12 14:10:47.273545 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.273519 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86d5bb5598-7frnq_3e0cfd26-6a5b-4e62-b48e-a4ed8612d32f/console/0.log"
Mar 12 14:10:47.710670 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.710618 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"]
Mar 12 14:10:47.715184 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.715158 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.723764 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.723728 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"]
Mar 12 14:10:47.788366 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.788332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-lib-modules\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.788717 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.788693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjlk\" (UniqueName: \"kubernetes.io/projected/29bbd091-8a2e-404b-a079-b444315e97bc-kube-api-access-bjjlk\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.788966 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.788948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-proc\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.789127 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.789113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-podres\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.789318 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.789271 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-sys\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.890825 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.890791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-proc\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.891052 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.891025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-proc\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.891223 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.891184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-podres\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.891302 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.891047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-podres\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.891454 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.891423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-sys\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.891608 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.891581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-lib-modules\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.891608 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.891486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-sys\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.891767 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.891619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjlk\" (UniqueName: \"kubernetes.io/projected/29bbd091-8a2e-404b-a079-b444315e97bc-kube-api-access-bjjlk\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.891767 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.891698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29bbd091-8a2e-404b-a079-b444315e97bc-lib-modules\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:47.901376 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:47.901351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjlk\" (UniqueName: \"kubernetes.io/projected/29bbd091-8a2e-404b-a079-b444315e97bc-kube-api-access-bjjlk\") pod \"perf-node-gather-daemonset-dzs59\" (UID: \"29bbd091-8a2e-404b-a079-b444315e97bc\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:48.026717 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:48.026622 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:48.176258 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:48.176216 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"]
Mar 12 14:10:48.179140 ip-10-0-139-20 kubenswrapper[2575]: W0312 14:10:48.179110 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod29bbd091_8a2e_404b_a079_b444315e97bc.slice/crio-fe8984185a101b59afa420e071bb9d7968ca9216223d0f1a7c834e769ea0e846 WatchSource:0}: Error finding container fe8984185a101b59afa420e071bb9d7968ca9216223d0f1a7c834e769ea0e846: Status 404 returned error can't find the container with id fe8984185a101b59afa420e071bb9d7968ca9216223d0f1a7c834e769ea0e846
Mar 12 14:10:48.456007 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:48.455980 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jjsfd_59289eca-3781-496b-9498-b1ba7c5d593e/dns/0.log"
Mar 12 14:10:48.476383 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:48.476358 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jjsfd_59289eca-3781-496b-9498-b1ba7c5d593e/kube-rbac-proxy/0.log"
Mar 12 14:10:48.544408 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:48.544386 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-blv4t_ab59b1c8-0dc2-45d5-aea6-a91ec018f894/dns-node-resolver/0.log"
Mar 12 14:10:48.791173 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:48.791097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59" event={"ID":"29bbd091-8a2e-404b-a079-b444315e97bc","Type":"ContainerStarted","Data":"677749fe5063cb461d972030a9e0de6253548c9f92a9e91448c8d0d0ad6d158b"}
Mar 12 14:10:48.791173 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:48.791133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59" event={"ID":"29bbd091-8a2e-404b-a079-b444315e97bc","Type":"ContainerStarted","Data":"fe8984185a101b59afa420e071bb9d7968ca9216223d0f1a7c834e769ea0e846"}
Mar 12 14:10:48.791568 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:48.791247 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:48.809308 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:48.809253 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59" podStartSLOduration=1.809234194 podStartE2EDuration="1.809234194s" podCreationTimestamp="2026-03-12 14:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:10:48.808134319 +0000 UTC m=+1982.897658404" watchObservedRunningTime="2026-03-12 14:10:48.809234194 +0000 UTC m=+1982.898758279"
Mar 12 14:10:49.014496 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:49.014468 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-57959578bc-tzqds_e87a1b1a-115b-4b7e-8c9c-69b2fd21d9d9/registry/0.log"
Mar 12 14:10:49.032979 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:49.032949 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g64nq_fc8195c5-3667-46e7-8bca-1b80b2d9943d/node-ca/0.log"
Mar 12 14:10:50.109997 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:50.109970 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4jz8n_3eb6f831-4019-43bf-9cec-d541e8e0f1dc/serve-healthcheck-canary/0.log"
Mar 12 14:10:50.567665 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:50.567578 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5nw2x_35d3b28f-1cd7-403d-b055-e9982477c6c5/kube-rbac-proxy/0.log"
Mar 12 14:10:50.589989 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:50.589964 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5nw2x_35d3b28f-1cd7-403d-b055-e9982477c6c5/exporter/0.log"
Mar 12 14:10:50.617596 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:50.617575 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5nw2x_35d3b28f-1cd7-403d-b055-e9982477c6c5/extractor/0.log"
Mar 12 14:10:54.808469 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:54.807852 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-dzs59"
Mar 12 14:10:55.681251 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:55.681222 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-ftzcs_6819031a-6b93-42f2-b7d5-28fc80fafb35/migrator/0.log"
Mar 12 14:10:55.704637 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:55.704611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-ftzcs_6819031a-6b93-42f2-b7d5-28fc80fafb35/graceful-termination/0.log"
Mar 12 14:10:57.119450 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.119420 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jt74_43d2e0f6-060c-4389-9a1f-5bdb06198e7b/kube-multus/0.log"
Mar 12 14:10:57.502440 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.502414 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qq8v5_c07aa00c-e596-44da-b75d-f3772a7057fd/kube-multus-additional-cni-plugins/0.log"
Mar 12 14:10:57.522589 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.522562 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qq8v5_c07aa00c-e596-44da-b75d-f3772a7057fd/egress-router-binary-copy/0.log"
Mar 12 14:10:57.541958 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.541928 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qq8v5_c07aa00c-e596-44da-b75d-f3772a7057fd/cni-plugins/0.log"
Mar 12 14:10:57.563089 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.563073 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qq8v5_c07aa00c-e596-44da-b75d-f3772a7057fd/bond-cni-plugin/0.log"
Mar 12 14:10:57.582431 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.582409 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qq8v5_c07aa00c-e596-44da-b75d-f3772a7057fd/routeoverride-cni/0.log"
Mar 12 14:10:57.601945 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.601895 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qq8v5_c07aa00c-e596-44da-b75d-f3772a7057fd/whereabouts-cni-bincopy/0.log"
Mar 12 14:10:57.723621 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.723589 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qq8v5_c07aa00c-e596-44da-b75d-f3772a7057fd/whereabouts-cni/0.log"
Mar 12 14:10:57.819087 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.819062 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qwv64_e076d25a-0359-40a3-8294-d82580c2252e/network-metrics-daemon/0.log"
Mar 12 14:10:57.838389 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:57.838359 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qwv64_e076d25a-0359-40a3-8294-d82580c2252e/kube-rbac-proxy/0.log"
Mar 12 14:10:59.189857 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:59.189823 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-plcmr_24428774-0c1d-4253-a9b0-384ed1b79796/ovn-controller/0.log"
Mar 12 14:10:59.220032 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:59.220008 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-plcmr_24428774-0c1d-4253-a9b0-384ed1b79796/ovn-acl-logging/0.log"
Mar 12 14:10:59.238058 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:59.238031 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-plcmr_24428774-0c1d-4253-a9b0-384ed1b79796/kube-rbac-proxy-node/0.log"
Mar 12 14:10:59.259612 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:59.259591 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-plcmr_24428774-0c1d-4253-a9b0-384ed1b79796/kube-rbac-proxy-ovn-metrics/0.log"
Mar 12 14:10:59.283869 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:59.283845 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-plcmr_24428774-0c1d-4253-a9b0-384ed1b79796/northd/0.log"
Mar 12 14:10:59.306622 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:59.306597 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-plcmr_24428774-0c1d-4253-a9b0-384ed1b79796/nbdb/0.log"
Mar 12 14:10:59.326802 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:59.326778 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-plcmr_24428774-0c1d-4253-a9b0-384ed1b79796/sbdb/0.log"
Mar 12 14:10:59.427217 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:10:59.427185 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-plcmr_24428774-0c1d-4253-a9b0-384ed1b79796/ovnkube-controller/0.log"
Mar 12 14:11:00.419330 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:11:00.419227 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mms2n_018363d6-b28d-4856-9451-fcf1632349aa/network-check-target-container/0.log"
Mar 12 14:11:01.299375 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:11:01.299348 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-4rzbn_7812c8ef-b633-4d0e-bdb1-683d0a4b9dd6/iptables-alerter/0.log"
Mar 12 14:11:02.025161 ip-10-0-139-20 kubenswrapper[2575]: I0312 14:11:02.025132 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-qjdg7_4a7d4073-afc5-478a-8838-a78fa193f1bd/tuned/0.log"